Mar 18 11:56:33 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 11:56:33 crc restorecon[4706]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 11:56:33 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 11:56:34 crc restorecon[4706]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34
crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 
11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 11:56:34 crc restorecon[4706]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 
crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 11:56:34 crc restorecon[4706]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 11:56:34 crc restorecon[4706]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 11:56:35 crc kubenswrapper[4965]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 11:56:35 crc kubenswrapper[4965]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 11:56:35 crc kubenswrapper[4965]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 11:56:35 crc kubenswrapper[4965]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 11:56:35 crc kubenswrapper[4965]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 11:56:35 crc kubenswrapper[4965]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.583095 4965 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.595924 4965 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.595962 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.595972 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.595981 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.595990 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.595998 4965 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596006 4965 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596015 4965 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596024 4965 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 
11:56:35.596032 4965 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596041 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596049 4965 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596057 4965 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596065 4965 feature_gate.go:330] unrecognized feature gate: Example Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596073 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596082 4965 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596090 4965 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596097 4965 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596105 4965 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596113 4965 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596125 4965 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596136 4965 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596144 4965 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596152 4965 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596160 4965 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596168 4965 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596176 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596183 4965 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596191 4965 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596199 4965 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596206 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596214 4965 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596235 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596243 4965 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596252 4965 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596259 4965 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596267 4965 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596274 4965 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596282 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596293 4965 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596303 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596312 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596322 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596330 4965 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596338 4965 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596347 4965 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596355 4965 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596363 4965 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596371 4965 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596379 4965 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596386 4965 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596394 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596402 4965 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596410 4965 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596421 4965 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596431 4965 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596439 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596447 4965 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596455 4965 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596463 4965 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596474 4965 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596483 4965 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596491 4965 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596500 4965 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596508 4965 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596516 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596524 4965 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596534 4965 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596542 4965 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596550 4965 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.596558 4965 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596742 4965 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596759 4965 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596781 4965 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596792 4965 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596804 4965 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596814 4965 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596826 4965 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596837 4965 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596847 4965 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596855 4965 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596866 4965 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596887 4965 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596897 4965 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596906 4965 flags.go:64] FLAG: --cgroup-root=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596915 4965 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596925 4965 flags.go:64] FLAG: --client-ca-file=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596934 4965 flags.go:64] FLAG: --cloud-config=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596943 4965 flags.go:64] FLAG: --cloud-provider=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.596951 4965 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597270 4965 flags.go:64] FLAG: --cluster-domain=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597280 4965 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597289 4965 flags.go:64] FLAG: --config-dir=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597298 4965 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597308 4965 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597319 4965 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597328 4965 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597337 4965 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597347 4965 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597356 4965 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597365 4965 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597374 4965 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597384 4965 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597393 4965 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597404 4965 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597415 4965 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597424 4965 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597434 4965 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597444 4965 flags.go:64] FLAG: --enable-server="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597453 4965 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597470 4965 flags.go:64] FLAG: --event-burst="100"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597480 4965 flags.go:64] FLAG: --event-qps="50"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597489 4965 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597499 4965 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597508 4965 flags.go:64] FLAG: --eviction-hard=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597518 4965 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597527 4965 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597536 4965 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597561 4965 flags.go:64] FLAG: --eviction-soft=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597571 4965 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597580 4965 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597589 4965 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597599 4965 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597608 4965 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597617 4965 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597626 4965 flags.go:64] FLAG: --feature-gates=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597637 4965 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597646 4965 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597660 4965 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597695 4965 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597705 4965 flags.go:64] FLAG: --healthz-port="10248"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597714 4965 flags.go:64] FLAG: --help="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597723 4965 flags.go:64] FLAG: --hostname-override=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597733 4965 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597742 4965 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597751 4965 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597760 4965 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597769 4965 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597778 4965 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597787 4965 flags.go:64] FLAG: --image-service-endpoint=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597796 4965 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597805 4965 flags.go:64] FLAG: --kube-api-burst="100"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597814 4965 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597824 4965 flags.go:64] FLAG: --kube-api-qps="50"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597833 4965 flags.go:64] FLAG: --kube-reserved=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597842 4965 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597851 4965 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597861 4965 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597872 4965 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597881 4965 flags.go:64] FLAG: --lock-file=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597890 4965 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597899 4965 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597908 4965 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597929 4965 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597954 4965 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597963 4965 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597972 4965 flags.go:64] FLAG: --logging-format="text"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597981 4965 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.597991 4965 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598000 4965 flags.go:64] FLAG: --manifest-url=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598009 4965 flags.go:64] FLAG: --manifest-url-header=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598020 4965 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598029 4965 flags.go:64] FLAG: --max-open-files="1000000"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598040 4965 flags.go:64] FLAG: --max-pods="110"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598050 4965 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598059 4965 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598068 4965 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598077 4965 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598086 4965 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598095 4965 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598104 4965 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598125 4965 flags.go:64] FLAG: --node-status-max-images="50"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598134 4965 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598143 4965 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598152 4965 flags.go:64] FLAG: --pod-cidr=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598161 4965 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598176 4965 flags.go:64] FLAG: --pod-manifest-path=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598185 4965 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598194 4965 flags.go:64] FLAG: --pods-per-core="0"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598204 4965 flags.go:64] FLAG: --port="10250"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598213 4965 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598222 4965 flags.go:64] FLAG: --provider-id=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598231 4965 flags.go:64] FLAG: --qos-reserved=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598240 4965 flags.go:64] FLAG: --read-only-port="10255"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598249 4965 flags.go:64] FLAG: --register-node="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598259 4965 flags.go:64] FLAG: --register-schedulable="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598267 4965 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598284 4965 flags.go:64] FLAG: --registry-burst="10"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598293 4965 flags.go:64] FLAG: --registry-qps="5"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598302 4965 flags.go:64] FLAG: --reserved-cpus=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598323 4965 flags.go:64] FLAG: --reserved-memory=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598334 4965 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598343 4965 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598352 4965 flags.go:64] FLAG: --rotate-certificates="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598361 4965 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598372 4965 flags.go:64] FLAG: --runonce="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598381 4965 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598390 4965 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598400 4965 flags.go:64] FLAG: --seccomp-default="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598409 4965 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598419 4965 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598429 4965 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598439 4965 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598448 4965 flags.go:64] FLAG: --storage-driver-password="root"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598457 4965 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598467 4965 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598476 4965 flags.go:64] FLAG: --storage-driver-user="root"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598486 4965 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598495 4965 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598504 4965 flags.go:64] FLAG: --system-cgroups=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598513 4965 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598527 4965 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598535 4965 flags.go:64] FLAG: --tls-cert-file=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598544 4965 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598561 4965 flags.go:64] FLAG: --tls-min-version=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598570 4965 flags.go:64] FLAG: --tls-private-key-file=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598579 4965 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598587 4965 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598596 4965 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598608 4965 flags.go:64] FLAG: --v="2"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598619 4965 flags.go:64] FLAG: --version="false"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598632 4965 flags.go:64] FLAG: --vmodule=""
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598643 4965 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.598652 4965 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598897 4965 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598908 4965 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598929 4965 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598938 4965 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598946 4965 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598954 4965 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598961 4965 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598969 4965 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598978 4965 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598986 4965 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.598994 4965 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599002 4965 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599010 4965 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599018 4965 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599026 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599034 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599044 4965 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599053 4965 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599065 4965 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599073 4965 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599081 4965 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599089 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599099 4965 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599110 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599118 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599126 4965 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599137 4965 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599149 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599157 4965 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599167 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599175 4965 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599184 4965 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599192 4965 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599200 4965 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599209 4965 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599217 4965 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599225 4965 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599232 4965 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599243 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599252 4965 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599260 4965 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599268 4965 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599276 4965 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599284 4965 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599292 4965 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599299 4965 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599307 4965 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599315 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599323 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599334 4965 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599347 4965 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599356 4965 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599366 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599375 4965 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599384 4965 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599393 4965 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599403 4965 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599413 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599422 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599432 4965 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599441 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599449 4965 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599457 4965 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599465 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599472 4965 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599481 4965 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599489 4965 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599497 4965 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599505 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599513 4965 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.599521 4965 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.599575 4965 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.613838 4965 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.613900 4965 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614052 4965 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614078 4965 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614089 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614099 4965 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614111 4965 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614121 4965 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614134 4965 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614148 4965 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614158 4965 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614168 4965 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614177 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614186 4965 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614195 4965 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614203 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614211 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614220 4965 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 11:56:35 crc kubenswrapper[4965]: 
W0318 11:56:35.614228 4965 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614237 4965 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614245 4965 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614254 4965 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614262 4965 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614271 4965 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614280 4965 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614288 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614298 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614309 4965 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614319 4965 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614333 4965 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614345 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614356 4965 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614366 4965 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614375 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614386 4965 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614394 4965 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614405 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614414 4965 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614423 4965 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614432 4965 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614441 4965 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614450 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614459 4965 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614468 4965 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 
11:56:35.614477 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614485 4965 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614494 4965 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614503 4965 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614512 4965 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614521 4965 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614530 4965 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614538 4965 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614547 4965 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614556 4965 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614565 4965 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614573 4965 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614582 4965 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614590 4965 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614599 4965 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614610 4965 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614618 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614626 4965 feature_gate.go:330] unrecognized feature gate: Example Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614635 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614643 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614658 4965 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614666 4965 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614719 4965 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614730 4965 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614740 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614750 4965 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614759 4965 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614767 4965 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.614780 4965 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.614794 4965 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615103 4965 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615118 4965 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615128 4965 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615137 4965 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 
11:56:35.615146 4965 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615155 4965 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615163 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615171 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615180 4965 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615206 4965 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615215 4965 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615223 4965 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615231 4965 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615240 4965 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615248 4965 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615256 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615264 4965 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615274 4965 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615283 4965 feature_gate.go:330] unrecognized feature gate: 
UpgradeStatus Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615292 4965 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615300 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615308 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615317 4965 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615327 4965 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615338 4965 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615347 4965 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615356 4965 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615366 4965 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615375 4965 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615384 4965 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615392 4965 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615402 4965 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615411 4965 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 11:56:35 crc 
kubenswrapper[4965]: W0318 11:56:35.615420 4965 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615440 4965 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615449 4965 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615458 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615466 4965 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615475 4965 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615486 4965 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615497 4965 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615508 4965 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615520 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615529 4965 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615538 4965 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615547 4965 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615555 4965 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615564 4965 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615575 4965 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615585 4965 feature_gate.go:330] unrecognized feature gate: Example Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615594 4965 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615603 4965 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615612 4965 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615621 4965 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615630 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615639 4965 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615647 4965 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615661 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615690 4965 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615699 4965 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615707 4965 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615716 4965 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615724 4965 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615733 4965 feature_gate.go:330] 
unrecognized feature gate: BareMetalLoadBalancer Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615741 4965 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615750 4965 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615758 4965 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615767 4965 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615775 4965 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615784 4965 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.615820 4965 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.615834 4965 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.616161 4965 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 11:56:35 crc kubenswrapper[4965]: E0318 11:56:35.640250 4965 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 11:56:35 
crc kubenswrapper[4965]: I0318 11:56:35.645518 4965 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.645705 4965 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.671425 4965 server.go:997] "Starting client certificate rotation" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.671480 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.671789 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.824423 4965 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.827285 4965 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 11:56:35 crc kubenswrapper[4965]: E0318 11:56:35.827784 4965 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.849935 4965 log.go:25] "Validated CRI v1 runtime API" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.881936 4965 log.go:25] "Validated CRI v1 image API" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.883826 4965 server.go:1437] "Using cgroup driver setting received from the CRI runtime" 
cgroupDriver="systemd" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.889947 4965 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-11-51-41-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.889994 4965 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.924203 4965 manager.go:217] Machine: {Timestamp:2026-03-18 11:56:35.919209683 +0000 UTC m=+0.905397252 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f21bc216-4db6-44fa-8f07-5ffceb9c90c0 BootID:6bb909a7-2031-4da1-8950-7746d364df6b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 
HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e4:5b:fa Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e4:5b:fa Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:04:85:06 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:99:5c:a5 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8f:87:b4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:57:49:5a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e6:64:c3:22:d1:7e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:82:cd:62:44:85:0a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: 
DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 11:56:35 crc 
kubenswrapper[4965]: I0318 11:56:35.924577 4965 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.924805 4965 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.926945 4965 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.927297 4965 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.927365 4965 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"Grac
ePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.927809 4965 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.927829 4965 container_manager_linux.go:303] "Creating device plugin manager" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.928568 4965 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.928601 4965 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.928988 4965 state_mem.go:36] "Initialized new in-memory state store" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.929124 4965 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.933265 4965 kubelet.go:418] "Attempting to sync node with API server" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.933297 4965 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.933336 4965 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.933355 4965 kubelet.go:324] "Adding apiserver pod source" Mar 18 11:56:35 crc 
kubenswrapper[4965]: I0318 11:56:35.933372 4965 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.940351 4965 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.941626 4965 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.943387 4965 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.943797 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:35 crc kubenswrapper[4965]: E0318 11:56:35.944156 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.944217 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:35 crc kubenswrapper[4965]: E0318 11:56:35.944323 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946040 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946086 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946101 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946116 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946138 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946151 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946164 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946185 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946202 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946218 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946236 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.946249 4965 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.948350 4965 
plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.949072 4965 server.go:1280] "Started kubelet" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.949240 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.950145 4965 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.950143 4965 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.950765 4965 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 11:56:35 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.959286 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.959378 4965 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 11:56:35 crc kubenswrapper[4965]: E0318 11:56:35.959772 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.959814 4965 server.go:460] "Adding debug handlers to kubelet server" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.959974 4965 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.959998 4965 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.960136 4965 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 18 11:56:35 crc kubenswrapper[4965]: E0318 11:56:35.960917 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.962828 4965 factory.go:55] Registering systemd factory Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.962857 4965 factory.go:221] Registration of the systemd container factory successfully Mar 18 11:56:35 crc kubenswrapper[4965]: W0318 11:56:35.965893 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:35 crc kubenswrapper[4965]: E0318 11:56:35.966028 4965 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.968114 4965 factory.go:153] Registering CRI-O factory Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.968155 4965 factory.go:221] Registration of the crio container factory successfully Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.968258 4965 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.968295 4965 factory.go:103] Registering Raw factory Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.968321 4965 manager.go:1196] Started watching for new ooms in manager Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.969028 4965 manager.go:319] Starting recovery of all containers Mar 18 11:56:35 crc kubenswrapper[4965]: E0318 11:56:35.968183 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ded87d7b278c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,LastTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981239 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981332 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981348 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981362 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981374 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981389 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" 
seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981406 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981420 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981435 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981447 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981463 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981513 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981527 4965 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981542 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981554 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981565 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981577 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981589 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.981603 4965 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984754 4965 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984791 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984806 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984822 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984835 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984849 4965 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984862 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984875 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984891 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984907 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984920 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984934 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984947 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984971 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984984 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.984996 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985010 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985026 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985039 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985055 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985068 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985081 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985095 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985113 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985127 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985140 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985156 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985170 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985184 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985198 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985211 4965 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985224 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985239 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985251 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985268 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985283 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985297 4965 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985310 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985323 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985336 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985350 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985363 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985374 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985387 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985401 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985414 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985428 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985441 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985454 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985470 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985487 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985850 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985865 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985879 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985897 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985931 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985946 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985958 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985969 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985981 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.985993 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986005 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986017 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986029 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986041 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986054 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986065 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" 
seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986078 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986090 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986101 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986113 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986126 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986138 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 11:56:35 
crc kubenswrapper[4965]: I0318 11:56:35.986150 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986161 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986172 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986185 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986198 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986210 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986223 4965 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986236 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986249 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986263 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986279 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986290 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986303 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986320 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986334 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986348 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986361 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986374 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986388 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986402 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986414 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986427 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986440 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986451 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986462 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986475 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986487 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.986498 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987210 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987350 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987420 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987456 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987507 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987542 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987575 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987619 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987721 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" 
seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987778 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987812 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987840 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987885 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987915 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.987956 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: 
I0318 11:56:35.987989 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988019 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988067 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988096 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988127 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988163 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988184 4965 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988212 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988236 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988257 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988289 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988313 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988353 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988384 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988414 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988454 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988491 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.988527 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990473 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990502 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990533 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990554 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990572 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990595 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990610 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990634 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990651 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990694 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990719 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990734 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990755 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" 
seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990770 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990787 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990807 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990827 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990848 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990869 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990885 4965 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990908 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990927 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990949 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990967 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.990985 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991007 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991025 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991047 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991070 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991087 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991109 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991126 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991146 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991168 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991184 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991206 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991222 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991271 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991304 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991322 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991336 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991350 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991366 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991379 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991397 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991413 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991426 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991442 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991455 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991472 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991485 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991498 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991520 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991539 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991559 4965 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991584 4965 reconstruct.go:97] "Volume reconstruction finished" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 11:56:35.991597 4965 reconciler.go:26] "Reconciler: start to sync state" Mar 18 11:56:35 crc kubenswrapper[4965]: I0318 
11:56:35.993638 4965 manager.go:324] Recovery completed Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.004429 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.006866 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.006904 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.006915 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.010046 4965 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.010073 4965 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.010100 4965 state_mem.go:36] "Initialized new in-memory state store" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.016019 4965 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.019570 4965 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.019610 4965 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.019635 4965 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.019828 4965 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 11:56:36 crc kubenswrapper[4965]: W0318 11:56:36.021595 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.021695 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.030159 4965 policy_none.go:49] "None policy: Start" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.032192 4965 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.032241 4965 state_mem.go:35] "Initializing new in-memory state store" Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.060596 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.086466 4965 manager.go:334] "Starting Device Plugin manager" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.086638 4965 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.086676 4965 server.go:79] "Starting device plugin registration server" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.087122 4965 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.087148 4965 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.087444 4965 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.087554 4965 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.087569 4965 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.100313 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.120834 4965 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.120995 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.124307 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.124348 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.124357 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.124517 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.124973 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.125165 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.126390 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.126550 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.126707 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.127013 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.127173 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.127238 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.127285 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.127334 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.127355 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.129038 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.129062 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.129072 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.129115 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.129137 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.129160 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.129295 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.129427 
4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.129458 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.130394 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.130418 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.130429 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.130630 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.130643 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.130657 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.130797 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.131205 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.131245 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.131989 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.132014 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.132035 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.132020 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.132046 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.132136 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.132399 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.132448 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.133372 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.133403 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.133414 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.162200 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.187647 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.188891 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.188922 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.188931 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.188952 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.189386 4965 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194722 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194760 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194784 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194807 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194835 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194858 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194903 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194936 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194954 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.194987 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.195016 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.195039 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.195053 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.195071 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.195085 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.295790 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.296270 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.296406 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.296157 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.296478 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.296999 4965 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297131 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.296846 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297158 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297469 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297630 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297802 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297336 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297810 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297542 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297874 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.297945 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.298102 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.298154 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.298199 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.298243 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.298292 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.298343 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.298708 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.298938 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.298968 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.299017 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.299063 4965 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.299108 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.299188 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.389954 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.391832 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.391885 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.391902 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.391934 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.392595 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: 
connection refused" node="crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.453692 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.475543 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.486504 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: W0318 11:56:36.501999 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f633f682783abc47f64fd0f2bdd34ba9c975317e24bf6b81b32b2018f8356a40 WatchSource:0}: Error finding container f633f682783abc47f64fd0f2bdd34ba9c975317e24bf6b81b32b2018f8356a40: Status 404 returned error can't find the container with id f633f682783abc47f64fd0f2bdd34ba9c975317e24bf6b81b32b2018f8356a40 Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.506318 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.511803 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:56:36 crc kubenswrapper[4965]: W0318 11:56:36.516762 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-185444a946b2d9424af49be993431aeb4f3e44b7d76e2ceb09bb7d92cf550c7f WatchSource:0}: Error finding container 185444a946b2d9424af49be993431aeb4f3e44b7d76e2ceb09bb7d92cf550c7f: Status 404 returned error can't find the container with id 185444a946b2d9424af49be993431aeb4f3e44b7d76e2ceb09bb7d92cf550c7f Mar 18 11:56:36 crc kubenswrapper[4965]: W0318 11:56:36.527391 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-85b906892b3742a9d0b101801e8f8a821cd70f01f5b6e44cde49dde262721b9c WatchSource:0}: Error finding container 85b906892b3742a9d0b101801e8f8a821cd70f01f5b6e44cde49dde262721b9c: Status 404 returned error can't find the container with id 85b906892b3742a9d0b101801e8f8a821cd70f01f5b6e44cde49dde262721b9c Mar 18 11:56:36 crc kubenswrapper[4965]: W0318 11:56:36.529255 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-c0772cc23dbe1b96d2dfd46c2da8d7cf5724cacad1c3b024af2be8de9f109454 WatchSource:0}: Error finding container c0772cc23dbe1b96d2dfd46c2da8d7cf5724cacad1c3b024af2be8de9f109454: Status 404 returned error can't find the container with id c0772cc23dbe1b96d2dfd46c2da8d7cf5724cacad1c3b024af2be8de9f109454 Mar 18 11:56:36 crc kubenswrapper[4965]: W0318 11:56:36.539789 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-80e40e004bdf08f9f50fcd488949331b5699377e5d07edcebfda539bceda8149 
WatchSource:0}: Error finding container 80e40e004bdf08f9f50fcd488949331b5699377e5d07edcebfda539bceda8149: Status 404 returned error can't find the container with id 80e40e004bdf08f9f50fcd488949331b5699377e5d07edcebfda539bceda8149 Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.563783 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.792937 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.794459 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.794496 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.794509 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.794533 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.795041 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 18 11:56:36 crc kubenswrapper[4965]: W0318 11:56:36.843603 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:36 crc 
kubenswrapper[4965]: E0318 11:56:36.843728 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:36 crc kubenswrapper[4965]: W0318 11:56:36.925768 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:36 crc kubenswrapper[4965]: E0318 11:56:36.925867 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:36 crc kubenswrapper[4965]: I0318 11:56:36.950591 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.027300 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f633f682783abc47f64fd0f2bdd34ba9c975317e24bf6b81b32b2018f8356a40"} Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.028273 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"80e40e004bdf08f9f50fcd488949331b5699377e5d07edcebfda539bceda8149"} Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.029112 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c0772cc23dbe1b96d2dfd46c2da8d7cf5724cacad1c3b024af2be8de9f109454"} Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.030078 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"85b906892b3742a9d0b101801e8f8a821cd70f01f5b6e44cde49dde262721b9c"} Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.030898 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"185444a946b2d9424af49be993431aeb4f3e44b7d76e2ceb09bb7d92cf550c7f"} Mar 18 11:56:37 crc kubenswrapper[4965]: W0318 11:56:37.234157 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:37 crc kubenswrapper[4965]: E0318 11:56:37.234255 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:37 crc kubenswrapper[4965]: W0318 11:56:37.331118 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:37 crc kubenswrapper[4965]: E0318 11:56:37.331213 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:37 crc kubenswrapper[4965]: E0318 11:56:37.364591 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s" Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.595908 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.600385 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.600440 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.600464 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.600509 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:56:37 crc kubenswrapper[4965]: E0318 11:56:37.601253 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 18 
11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.887120 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 11:56:37 crc kubenswrapper[4965]: E0318 11:56:37.888443 4965 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:37 crc kubenswrapper[4965]: I0318 11:56:37.951033 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.036542 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"31fe6c510eec759d3e51ed22ed8c67a3e2ad0102b926bf349d4b431dc82c8b71"} Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.036605 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9dd304c138d70e2bab50741090aace5284f04e2d25fe04cf9188a91b6976f720"} Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.039219 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8686a109c054bacaff991779f6488d86d2eecb88f13b5e9e83e1680f4022af83" exitCode=0 Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.039341 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8686a109c054bacaff991779f6488d86d2eecb88f13b5e9e83e1680f4022af83"} Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.039399 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.040640 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.040706 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.040722 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.041397 4965 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0c517e496707ec5c650cc338ad6de538664a4c6779e43cc5842bcc856a407c07" exitCode=0 Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.041433 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0c517e496707ec5c650cc338ad6de538664a4c6779e43cc5842bcc856a407c07"} Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.041578 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.042715 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.042741 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.042750 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.042838 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.042906 4965 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="fc7b3a59b7b14c11fac67e8405aa72568dc064dc43cb5b21b730df7ba7490005" exitCode=0 Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.043007 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"fc7b3a59b7b14c11fac67e8405aa72568dc064dc43cb5b21b730df7ba7490005"} Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.043054 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.044511 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.044561 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.044580 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.045233 4965 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="dcec54333b6f645dc3ecf296217de2f6fffa037c98543a962be09347198823b3" exitCode=0 Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.045281 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"dcec54333b6f645dc3ecf296217de2f6fffa037c98543a962be09347198823b3"} Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.045377 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.045539 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.045573 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.045583 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.046489 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.046512 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.046522 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:38 crc kubenswrapper[4965]: E0318 11:56:38.413809 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ded87d7b278c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC 
m=+0.935196583,LastTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:56:38 crc kubenswrapper[4965]: W0318 11:56:38.640123 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:38 crc kubenswrapper[4965]: E0318 11:56:38.640245 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:38 crc kubenswrapper[4965]: W0318 11:56:38.850756 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:38 crc kubenswrapper[4965]: E0318 11:56:38.850840 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:38 crc kubenswrapper[4965]: I0318 11:56:38.950337 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: 
connection refused Mar 18 11:56:38 crc kubenswrapper[4965]: E0318 11:56:38.965453 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.051044 4965 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3f5c1e460ac0826657789ab61c4e7a471a52198601aa042304e7c5263fadff42" exitCode=0 Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.051283 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3f5c1e460ac0826657789ab61c4e7a471a52198601aa042304e7c5263fadff42"} Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.051337 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.052558 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.052601 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.052612 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.054422 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7fc3062a71936faa73fd7e61d6492631481b7a670f5c18df7142b6d84822ffef"} Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.054714 4965 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.056593 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.056627 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.056636 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.058964 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cbf8c6c01ff22dd5a8926b21a21b6984686a4a14952bdcd2b7ffc29763ce0aaa"} Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.059021 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"640b641ab7078cf8af599f51885c1fb7f687c4d55b86905836fd62ae1e465948"} Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.063039 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0ae5dcf08a3d0646fdf03e5fc18040f616522bdb59159203a0a61f97dce8c0be"} Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.063074 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"060950b9975ea55e0ef6fc8f34e3501096a0cec86b45fdc41d0b344ba6e77716"} Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.063146 
4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.063939 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.063958 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.063966 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.065937 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"90185fb5121b4cb1ffd94f24cce668838aa69c89a757de56246a7e9e3254005a"} Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.065959 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1489cfbddc1072ee2238c82e981b8dd676b935c235de66cb53613a2479018238"} Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.201804 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.203369 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.203414 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.203423 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.203449 4965 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:56:39 crc kubenswrapper[4965]: E0318 11:56:39.204086 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 18 11:56:39 crc kubenswrapper[4965]: W0318 11:56:39.697250 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:39 crc kubenswrapper[4965]: E0318 11:56:39.697353 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:39 crc kubenswrapper[4965]: W0318 11:56:39.765198 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:39 crc kubenswrapper[4965]: E0318 11:56:39.765277 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 18 11:56:39 crc kubenswrapper[4965]: I0318 11:56:39.949855 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.069930 4965 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="24d1caed073f6cf92e36489352ee5e9115555ad91ed8b98838a2dbd9ba83a512" exitCode=0 Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.069997 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"24d1caed073f6cf92e36489352ee5e9115555ad91ed8b98838a2dbd9ba83a512"} Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.070021 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.071314 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.071345 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.071354 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.073772 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cfe10935c0fa7bc755ecb20010a71b608e73b8131d1f47bc2bcba492ae5a0008"} Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.073796 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.078353 4965 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.078388 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.078401 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.088438 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.089103 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.089561 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"610d4310c1dbcd4e186234109f44e78d1c4e499125fc938c96bde7c44f0a029d"} Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.089609 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e6be9268fda0866f34859b014813011535f476e4a3004010d537f23df27440f1"} Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.089628 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"35983903bdfe8136c0d115201f4bfb23009d1cb8019ea7f9b647614eb8b27afe"} Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.089742 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.091066 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.091103 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.091121 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.091062 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.091403 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.091515 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.092098 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.092134 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:40 crc kubenswrapper[4965]: I0318 11:56:40.092149 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.067906 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.080105 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.100742 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4d2c74b7ec10c55934d60abb2e387d5e4ec01e850766dd2f9b3bea5b4d7e689a"} Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.100790 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.100802 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8d4f89ed850f436b4e160be036c9783095d5ea5beea287863a00f1545e91ee01"} Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.100818 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"65b14c7e3a847a05bdad58eefa71ee2244aaf3bdcf55439ff5deb5f3f3024433"} Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.100839 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.100850 4965 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.100934 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.100855 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8ba45b451c3688dfa7c6378ffc265fc114dc6cc3a1ee642c8ef508386bca700c"} Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.100790 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.102227 4965 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.102254 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.102269 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.102875 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.102917 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.102937 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.102986 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.103020 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:41 crc kubenswrapper[4965]: I0318 11:56:41.103038 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.037957 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.109303 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e018b82c9d8e6d7170068dc7f3ac04839c34ef9a2e0702f5bcce3bb56a4a29a4"} Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.109407 4965 prober_manager.go:312] "Failed to trigger a manual 
run" probe="Readiness" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.109484 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.109412 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.109490 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.110852 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.110869 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.110908 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.110932 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.110929 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.110969 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.111412 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.111456 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.111465 4965 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.331995 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.332572 4965 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.332807 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.334292 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.334345 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.334362 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.404360 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.406238 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.406294 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.406313 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.406351 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:56:42 crc kubenswrapper[4965]: I0318 11:56:42.765312 4965 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.112874 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.114161 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.114206 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.114215 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.782911 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.783099 4965 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.783157 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.784698 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.784743 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:43 crc kubenswrapper[4965]: I0318 11:56:43.784753 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.116007 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 
11:56:44.117183 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.117263 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.117325 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.360707 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.360898 4965 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.360951 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.362362 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.362418 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.362437 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.763644 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.763914 4965 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.763991 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 
11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.765489 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.765544 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:44 crc kubenswrapper[4965]: I0318 11:56:44.765562 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.012931 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.119560 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.121226 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.121449 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.121607 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.190794 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.191259 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.193232 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.193489 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:45 crc kubenswrapper[4965]: I0318 11:56:45.193741 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:46 crc kubenswrapper[4965]: E0318 11:56:46.100925 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:56:46 crc kubenswrapper[4965]: I0318 11:56:46.783400 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 11:56:46 crc kubenswrapper[4965]: I0318 11:56:46.783508 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 11:56:48 crc kubenswrapper[4965]: I0318 11:56:48.381863 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 11:56:48 crc kubenswrapper[4965]: I0318 11:56:48.383081 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:48 crc kubenswrapper[4965]: I0318 11:56:48.384709 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:48 crc kubenswrapper[4965]: I0318 11:56:48.384752 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:48 crc kubenswrapper[4965]: 
I0318 11:56:48.384764 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:50 crc kubenswrapper[4965]: I0318 11:56:50.833358 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z Mar 18 11:56:50 crc kubenswrapper[4965]: E0318 11:56:50.834162 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 11:56:50 crc kubenswrapper[4965]: E0318 11:56:50.834676 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ded87d7b278c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,LastTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:56:50 crc kubenswrapper[4965]: W0318 11:56:50.835868 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z Mar 18 11:56:50 crc kubenswrapper[4965]: E0318 11:56:50.836014 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:56:50 crc kubenswrapper[4965]: W0318 11:56:50.837402 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z Mar 18 11:56:50 crc kubenswrapper[4965]: E0318 11:56:50.837437 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:56:50 crc kubenswrapper[4965]: W0318 11:56:50.839441 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z Mar 18 11:56:50 crc kubenswrapper[4965]: E0318 11:56:50.839575 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:56:50 crc kubenswrapper[4965]: W0318 11:56:50.842886 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z Mar 18 11:56:50 crc kubenswrapper[4965]: E0318 11:56:50.843054 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:56:50 crc kubenswrapper[4965]: E0318 11:56:50.844011 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 11:56:50 crc kubenswrapper[4965]: 
E0318 11:56:50.850445 4965 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:56:50 crc kubenswrapper[4965]: I0318 11:56:50.852801 4965 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 11:56:50 crc kubenswrapper[4965]: I0318 11:56:50.852970 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 11:56:50 crc kubenswrapper[4965]: I0318 11:56:50.864448 4965 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 11:56:50 crc kubenswrapper[4965]: I0318 11:56:50.864781 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 11:56:50 crc kubenswrapper[4965]: I0318 11:56:50.952776 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:50Z is after 2026-02-23T05:33:13Z Mar 18 11:56:51 crc kubenswrapper[4965]: I0318 11:56:51.137766 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 11:56:51 crc kubenswrapper[4965]: I0318 11:56:51.140276 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="610d4310c1dbcd4e186234109f44e78d1c4e499125fc938c96bde7c44f0a029d" exitCode=255 Mar 18 11:56:51 crc kubenswrapper[4965]: I0318 11:56:51.140329 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"610d4310c1dbcd4e186234109f44e78d1c4e499125fc938c96bde7c44f0a029d"} Mar 18 11:56:51 crc kubenswrapper[4965]: I0318 11:56:51.140474 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:51 crc kubenswrapper[4965]: I0318 11:56:51.141288 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:51 crc kubenswrapper[4965]: I0318 11:56:51.141346 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:51 crc kubenswrapper[4965]: I0318 11:56:51.141370 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:51 crc kubenswrapper[4965]: 
I0318 11:56:51.142316 4965 scope.go:117] "RemoveContainer" containerID="610d4310c1dbcd4e186234109f44e78d1c4e499125fc938c96bde7c44f0a029d" Mar 18 11:56:51 crc kubenswrapper[4965]: I0318 11:56:51.961350 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:51Z is after 2026-02-23T05:33:13Z Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.143885 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.144302 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.146168 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dfe52b09d92c5afc6422d28285387b0d10baa7eaeaa3822c36f4d7aa61dddec9" exitCode=255 Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.146233 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dfe52b09d92c5afc6422d28285387b0d10baa7eaeaa3822c36f4d7aa61dddec9"} Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.146271 4965 scope.go:117] "RemoveContainer" containerID="610d4310c1dbcd4e186234109f44e78d1c4e499125fc938c96bde7c44f0a029d" Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.146412 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.147278 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.147331 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.147344 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.148051 4965 scope.go:117] "RemoveContainer" containerID="dfe52b09d92c5afc6422d28285387b0d10baa7eaeaa3822c36f4d7aa61dddec9" Mar 18 11:56:52 crc kubenswrapper[4965]: E0318 11:56:52.148259 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.557159 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:52 crc kubenswrapper[4965]: I0318 11:56:52.956916 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:52Z is after 2026-02-23T05:33:13Z Mar 18 11:56:53 crc kubenswrapper[4965]: I0318 11:56:53.157781 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 11:56:53 crc kubenswrapper[4965]: I0318 11:56:53.161581 4965 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:53 crc kubenswrapper[4965]: I0318 11:56:53.162773 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:53 crc kubenswrapper[4965]: I0318 11:56:53.162821 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:53 crc kubenswrapper[4965]: I0318 11:56:53.162838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:53 crc kubenswrapper[4965]: I0318 11:56:53.163459 4965 scope.go:117] "RemoveContainer" containerID="dfe52b09d92c5afc6422d28285387b0d10baa7eaeaa3822c36f4d7aa61dddec9" Mar 18 11:56:53 crc kubenswrapper[4965]: E0318 11:56:53.163722 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:56:53 crc kubenswrapper[4965]: I0318 11:56:53.955460 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:53Z is after 2026-02-23T05:33:13Z Mar 18 11:56:54 crc kubenswrapper[4965]: I0318 11:56:54.772619 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:54 crc kubenswrapper[4965]: I0318 11:56:54.772896 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 11:56:54 crc kubenswrapper[4965]: I0318 11:56:54.774583 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:54 crc kubenswrapper[4965]: I0318 11:56:54.774719 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:54 crc kubenswrapper[4965]: I0318 11:56:54.774747 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:54 crc kubenswrapper[4965]: I0318 11:56:54.775563 4965 scope.go:117] "RemoveContainer" containerID="dfe52b09d92c5afc6422d28285387b0d10baa7eaeaa3822c36f4d7aa61dddec9" Mar 18 11:56:54 crc kubenswrapper[4965]: E0318 11:56:54.775893 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:56:54 crc kubenswrapper[4965]: I0318 11:56:54.780179 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:54 crc kubenswrapper[4965]: I0318 11:56:54.956273 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:54Z is after 2026-02-23T05:33:13Z Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.013642 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 
11:56:55.168864 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.171256 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.171314 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.171333 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.172189 4965 scope.go:117] "RemoveContainer" containerID="dfe52b09d92c5afc6422d28285387b0d10baa7eaeaa3822c36f4d7aa61dddec9" Mar 18 11:56:55 crc kubenswrapper[4965]: E0318 11:56:55.172471 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.198543 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.198769 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.200354 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.200409 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.200427 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:55 crc kubenswrapper[4965]: I0318 11:56:55.955690 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:55Z is after 2026-02-23T05:33:13Z Mar 18 11:56:56 crc kubenswrapper[4965]: E0318 11:56:56.101226 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:56:56 crc kubenswrapper[4965]: I0318 11:56:56.171013 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:56 crc kubenswrapper[4965]: I0318 11:56:56.172118 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:56 crc kubenswrapper[4965]: I0318 11:56:56.172169 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:56 crc kubenswrapper[4965]: I0318 11:56:56.172187 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:56 crc kubenswrapper[4965]: I0318 11:56:56.172985 4965 scope.go:117] "RemoveContainer" containerID="dfe52b09d92c5afc6422d28285387b0d10baa7eaeaa3822c36f4d7aa61dddec9" Mar 18 11:56:56 crc kubenswrapper[4965]: E0318 11:56:56.173256 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:56:56 crc kubenswrapper[4965]: I0318 11:56:56.783560 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 11:56:56 crc kubenswrapper[4965]: I0318 11:56:56.783714 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 11:56:56 crc kubenswrapper[4965]: I0318 11:56:56.952281 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:56Z is after 2026-02-23T05:33:13Z Mar 18 11:56:57 crc kubenswrapper[4965]: I0318 11:56:57.235149 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:57 crc kubenswrapper[4965]: I0318 11:56:57.236740 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:57 crc kubenswrapper[4965]: I0318 11:56:57.236780 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:57 crc kubenswrapper[4965]: I0318 11:56:57.236791 4965 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 11:56:57 crc kubenswrapper[4965]: I0318 11:56:57.236814 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:56:57 crc kubenswrapper[4965]: E0318 11:56:57.241753 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:57Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 11:56:57 crc kubenswrapper[4965]: E0318 11:56:57.249806 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:57Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 11:56:57 crc kubenswrapper[4965]: I0318 11:56:57.955641 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:57Z is after 2026-02-23T05:33:13Z Mar 18 11:56:58 crc kubenswrapper[4965]: W0318 11:56:58.178182 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:58Z is after 2026-02-23T05:33:13Z Mar 18 11:56:58 crc kubenswrapper[4965]: E0318 11:56:58.178286 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:56:58 crc kubenswrapper[4965]: I0318 11:56:58.414791 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 11:56:58 crc kubenswrapper[4965]: I0318 11:56:58.415034 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:58 crc kubenswrapper[4965]: I0318 11:56:58.416748 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:58 crc kubenswrapper[4965]: I0318 11:56:58.416811 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:58 crc kubenswrapper[4965]: I0318 11:56:58.416835 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:58 crc kubenswrapper[4965]: I0318 11:56:58.429256 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 11:56:58 crc kubenswrapper[4965]: W0318 11:56:58.849086 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:58Z is after 2026-02-23T05:33:13Z Mar 18 11:56:58 crc kubenswrapper[4965]: E0318 11:56:58.849192 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:56:58 crc kubenswrapper[4965]: I0318 11:56:58.955900 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:58Z is after 2026-02-23T05:33:13Z Mar 18 11:56:59 crc kubenswrapper[4965]: I0318 11:56:59.179443 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:56:59 crc kubenswrapper[4965]: I0318 11:56:59.180819 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:56:59 crc kubenswrapper[4965]: I0318 11:56:59.180928 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:56:59 crc kubenswrapper[4965]: I0318 11:56:59.180954 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:56:59 crc kubenswrapper[4965]: I0318 11:56:59.476830 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 11:56:59 crc kubenswrapper[4965]: E0318 11:56:59.482222 4965 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-18T11:56:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:56:59 crc kubenswrapper[4965]: I0318 11:56:59.953293 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:56:59Z is after 2026-02-23T05:33:13Z Mar 18 11:57:00 crc kubenswrapper[4965]: W0318 11:57:00.449167 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:00Z is after 2026-02-23T05:33:13Z Mar 18 11:57:00 crc kubenswrapper[4965]: E0318 11:57:00.449258 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:57:00 crc kubenswrapper[4965]: E0318 11:57:00.840223 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ded87d7b278c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,LastTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:00 crc kubenswrapper[4965]: I0318 11:57:00.954976 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:00Z is after 2026-02-23T05:33:13Z Mar 18 11:57:01 crc kubenswrapper[4965]: I0318 11:57:01.955420 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:01Z is after 2026-02-23T05:33:13Z Mar 18 11:57:02 crc kubenswrapper[4965]: W0318 11:57:02.468179 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:02Z is after 2026-02-23T05:33:13Z Mar 18 11:57:02 crc kubenswrapper[4965]: E0318 11:57:02.468275 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:57:02 crc kubenswrapper[4965]: I0318 11:57:02.953900 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:02Z is after 2026-02-23T05:33:13Z Mar 18 11:57:03 crc kubenswrapper[4965]: I0318 11:57:03.952723 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:03Z is after 2026-02-23T05:33:13Z Mar 18 11:57:04 crc kubenswrapper[4965]: I0318 11:57:04.241882 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:04 crc kubenswrapper[4965]: I0318 11:57:04.243141 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:04 crc kubenswrapper[4965]: I0318 11:57:04.243188 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:04 crc kubenswrapper[4965]: I0318 11:57:04.243201 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:04 crc kubenswrapper[4965]: I0318 11:57:04.243227 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:57:04 crc kubenswrapper[4965]: E0318 11:57:04.246103 4965 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:04Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 11:57:04 crc kubenswrapper[4965]: E0318 11:57:04.252817 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:04Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 11:57:04 crc kubenswrapper[4965]: I0318 11:57:04.956854 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:04Z is after 2026-02-23T05:33:13Z Mar 18 11:57:05 crc kubenswrapper[4965]: I0318 11:57:05.953283 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:05Z is after 2026-02-23T05:33:13Z Mar 18 11:57:06 crc kubenswrapper[4965]: E0318 11:57:06.101320 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.784775 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled 
while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.785142 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.785345 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.785592 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.787007 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.787249 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.787431 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.788277 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"31fe6c510eec759d3e51ed22ed8c67a3e2ad0102b926bf349d4b431dc82c8b71"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.788701 4965 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://31fe6c510eec759d3e51ed22ed8c67a3e2ad0102b926bf349d4b431dc82c8b71" gracePeriod=30 Mar 18 11:57:06 crc kubenswrapper[4965]: I0318 11:57:06.953765 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:06Z is after 2026-02-23T05:33:13Z Mar 18 11:57:07 crc kubenswrapper[4965]: I0318 11:57:07.203476 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 11:57:07 crc kubenswrapper[4965]: I0318 11:57:07.203876 4965 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="31fe6c510eec759d3e51ed22ed8c67a3e2ad0102b926bf349d4b431dc82c8b71" exitCode=255 Mar 18 11:57:07 crc kubenswrapper[4965]: I0318 11:57:07.203915 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"31fe6c510eec759d3e51ed22ed8c67a3e2ad0102b926bf349d4b431dc82c8b71"} Mar 18 11:57:07 crc kubenswrapper[4965]: I0318 11:57:07.203940 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"829f4fa5a9626b29eb02895443ca64611610fcd7a3446c1c3a393607374bf699"} Mar 18 11:57:07 crc kubenswrapper[4965]: I0318 11:57:07.204024 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 11:57:07 crc kubenswrapper[4965]: I0318 11:57:07.204972 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:07 crc kubenswrapper[4965]: I0318 11:57:07.205000 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:07 crc kubenswrapper[4965]: I0318 11:57:07.205009 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:07 crc kubenswrapper[4965]: I0318 11:57:07.954986 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:07Z is after 2026-02-23T05:33:13Z Mar 18 11:57:08 crc kubenswrapper[4965]: I0318 11:57:08.953729 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:08Z is after 2026-02-23T05:33:13Z Mar 18 11:57:09 crc kubenswrapper[4965]: I0318 11:57:09.955333 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:09Z is after 2026-02-23T05:33:13Z Mar 18 11:57:10 crc kubenswrapper[4965]: I0318 11:57:10.020390 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:10 crc kubenswrapper[4965]: I0318 11:57:10.021857 4965 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:10 crc kubenswrapper[4965]: I0318 11:57:10.021908 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:10 crc kubenswrapper[4965]: I0318 11:57:10.021924 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:10 crc kubenswrapper[4965]: I0318 11:57:10.022627 4965 scope.go:117] "RemoveContainer" containerID="dfe52b09d92c5afc6422d28285387b0d10baa7eaeaa3822c36f4d7aa61dddec9" Mar 18 11:57:10 crc kubenswrapper[4965]: E0318 11:57:10.846636 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:10Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ded87d7b278c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,LastTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:10 crc kubenswrapper[4965]: I0318 11:57:10.955200 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:10Z is after 2026-02-23T05:33:13Z Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 
11:57:11.218147 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.219235 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.222085 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c548aa614b02d0d8d1a6dfca392f9f1963a21274554ba500036d15d4cd01627c" exitCode=255 Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.222154 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c548aa614b02d0d8d1a6dfca392f9f1963a21274554ba500036d15d4cd01627c"} Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.222217 4965 scope.go:117] "RemoveContainer" containerID="dfe52b09d92c5afc6422d28285387b0d10baa7eaeaa3822c36f4d7aa61dddec9" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.222440 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.223885 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.223936 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.223955 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.224736 4965 scope.go:117] "RemoveContainer" 
containerID="c548aa614b02d0d8d1a6dfca392f9f1963a21274554ba500036d15d4cd01627c" Mar 18 11:57:11 crc kubenswrapper[4965]: E0318 11:57:11.225027 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.247228 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.249001 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.249065 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.249081 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.249112 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:57:11 crc kubenswrapper[4965]: E0318 11:57:11.254066 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:11Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 11:57:11 crc kubenswrapper[4965]: E0318 11:57:11.257813 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:11Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 11:57:11 crc kubenswrapper[4965]: I0318 11:57:11.956601 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:11Z is after 2026-02-23T05:33:13Z Mar 18 11:57:12 crc kubenswrapper[4965]: I0318 11:57:12.227846 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 11:57:12 crc kubenswrapper[4965]: I0318 11:57:12.557286 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:57:12 crc kubenswrapper[4965]: I0318 11:57:12.557563 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:12 crc kubenswrapper[4965]: I0318 11:57:12.559470 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:12 crc kubenswrapper[4965]: I0318 11:57:12.559520 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:12 crc kubenswrapper[4965]: I0318 11:57:12.559540 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:12 crc kubenswrapper[4965]: I0318 11:57:12.560596 4965 scope.go:117] "RemoveContainer" containerID="c548aa614b02d0d8d1a6dfca392f9f1963a21274554ba500036d15d4cd01627c" Mar 18 11:57:12 crc kubenswrapper[4965]: E0318 11:57:12.560990 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:57:12 crc kubenswrapper[4965]: I0318 11:57:12.954937 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:12Z is after 2026-02-23T05:33:13Z Mar 18 11:57:13 crc kubenswrapper[4965]: I0318 11:57:13.783782 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:57:13 crc kubenswrapper[4965]: I0318 11:57:13.783954 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:13 crc kubenswrapper[4965]: I0318 11:57:13.785246 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:13 crc kubenswrapper[4965]: I0318 11:57:13.785281 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:13 crc kubenswrapper[4965]: I0318 11:57:13.785292 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:13 crc kubenswrapper[4965]: I0318 11:57:13.955018 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:13Z is after 2026-02-23T05:33:13Z Mar 18 
11:57:14 crc kubenswrapper[4965]: W0318 11:57:14.121759 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:14Z is after 2026-02-23T05:33:13Z Mar 18 11:57:14 crc kubenswrapper[4965]: E0318 11:57:14.121919 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:57:14 crc kubenswrapper[4965]: I0318 11:57:14.360800 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:57:14 crc kubenswrapper[4965]: I0318 11:57:14.361044 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:14 crc kubenswrapper[4965]: I0318 11:57:14.362619 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:14 crc kubenswrapper[4965]: I0318 11:57:14.362725 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:14 crc kubenswrapper[4965]: I0318 11:57:14.362745 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:14 crc kubenswrapper[4965]: I0318 11:57:14.955388 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:14Z is after 2026-02-23T05:33:13Z Mar 18 11:57:15 crc kubenswrapper[4965]: I0318 11:57:15.013480 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:57:15 crc kubenswrapper[4965]: I0318 11:57:15.013733 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:15 crc kubenswrapper[4965]: I0318 11:57:15.015480 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:15 crc kubenswrapper[4965]: I0318 11:57:15.015592 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:15 crc kubenswrapper[4965]: I0318 11:57:15.015621 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:15 crc kubenswrapper[4965]: I0318 11:57:15.017154 4965 scope.go:117] "RemoveContainer" containerID="c548aa614b02d0d8d1a6dfca392f9f1963a21274554ba500036d15d4cd01627c" Mar 18 11:57:15 crc kubenswrapper[4965]: E0318 11:57:15.017590 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:57:15 crc kubenswrapper[4965]: I0318 11:57:15.955856 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:15Z is after 2026-02-23T05:33:13Z Mar 18 11:57:16 crc kubenswrapper[4965]: E0318 11:57:16.101733 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:57:16 crc kubenswrapper[4965]: I0318 11:57:16.633322 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 11:57:16 crc kubenswrapper[4965]: E0318 11:57:16.637183 4965 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:57:16 crc kubenswrapper[4965]: E0318 11:57:16.638430 4965 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 18 11:57:16 crc kubenswrapper[4965]: I0318 11:57:16.784231 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 11:57:16 crc kubenswrapper[4965]: I0318 11:57:16.784396 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 11:57:16 crc kubenswrapper[4965]: I0318 11:57:16.954481 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:16Z is after 2026-02-23T05:33:13Z Mar 18 11:57:17 crc kubenswrapper[4965]: I0318 11:57:17.952567 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:17Z is after 2026-02-23T05:33:13Z Mar 18 11:57:18 crc kubenswrapper[4965]: W0318 11:57:18.137741 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:18Z is after 2026-02-23T05:33:13Z Mar 18 11:57:18 crc kubenswrapper[4965]: E0318 11:57:18.137836 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:57:18 crc kubenswrapper[4965]: I0318 11:57:18.254565 4965 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:18 crc kubenswrapper[4965]: I0318 11:57:18.255708 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:18 crc kubenswrapper[4965]: I0318 11:57:18.255754 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:18 crc kubenswrapper[4965]: I0318 11:57:18.255770 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:18 crc kubenswrapper[4965]: I0318 11:57:18.255802 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:57:18 crc kubenswrapper[4965]: E0318 11:57:18.261480 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:18Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 11:57:18 crc kubenswrapper[4965]: E0318 11:57:18.265403 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:18Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 11:57:18 crc kubenswrapper[4965]: I0318 11:57:18.955694 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:18Z is after 2026-02-23T05:33:13Z Mar 18 11:57:19 crc kubenswrapper[4965]: I0318 11:57:19.954354 4965 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:19Z is after 2026-02-23T05:33:13Z Mar 18 11:57:20 crc kubenswrapper[4965]: W0318 11:57:20.139114 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:20Z is after 2026-02-23T05:33:13Z Mar 18 11:57:20 crc kubenswrapper[4965]: E0318 11:57:20.139220 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 11:57:20 crc kubenswrapper[4965]: E0318 11:57:20.852422 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:20Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ded87d7b278c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC 
m=+0.935196583,LastTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:20 crc kubenswrapper[4965]: I0318 11:57:20.955444 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:20Z is after 2026-02-23T05:33:13Z Mar 18 11:57:21 crc kubenswrapper[4965]: I0318 11:57:21.953532 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:21Z is after 2026-02-23T05:33:13Z Mar 18 11:57:22 crc kubenswrapper[4965]: I0318 11:57:22.952543 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:22Z is after 2026-02-23T05:33:13Z Mar 18 11:57:23 crc kubenswrapper[4965]: I0318 11:57:23.955537 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:57:23Z is after 2026-02-23T05:33:13Z Mar 18 11:57:24 crc kubenswrapper[4965]: W0318 11:57:24.193925 4965 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: 
runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 18 11:57:24 crc kubenswrapper[4965]: E0318 11:57:24.193998 4965 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 11:57:24 crc kubenswrapper[4965]: I0318 11:57:24.957197 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:25 crc kubenswrapper[4965]: I0318 11:57:25.261638 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:25 crc kubenswrapper[4965]: I0318 11:57:25.263593 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:25 crc kubenswrapper[4965]: I0318 11:57:25.263685 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:25 crc kubenswrapper[4965]: I0318 11:57:25.263711 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:25 crc kubenswrapper[4965]: I0318 11:57:25.263752 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:57:25 crc kubenswrapper[4965]: E0318 11:57:25.272175 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 11:57:25 crc kubenswrapper[4965]: E0318 
11:57:25.274653 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 11:57:25 crc kubenswrapper[4965]: I0318 11:57:25.956769 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:26 crc kubenswrapper[4965]: E0318 11:57:26.102060 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:57:26 crc kubenswrapper[4965]: I0318 11:57:26.784164 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 11:57:26 crc kubenswrapper[4965]: I0318 11:57:26.784302 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 11:57:26 crc kubenswrapper[4965]: I0318 11:57:26.957680 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:27 crc kubenswrapper[4965]: 
I0318 11:57:27.957378 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:28 crc kubenswrapper[4965]: I0318 11:57:28.956710 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:29 crc kubenswrapper[4965]: I0318 11:57:29.008547 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 11:57:29 crc kubenswrapper[4965]: I0318 11:57:29.008789 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:29 crc kubenswrapper[4965]: I0318 11:57:29.010139 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:29 crc kubenswrapper[4965]: I0318 11:57:29.010202 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:29 crc kubenswrapper[4965]: I0318 11:57:29.010220 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:29 crc kubenswrapper[4965]: I0318 11:57:29.955784 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:30 crc kubenswrapper[4965]: I0318 11:57:30.020943 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:30 crc kubenswrapper[4965]: I0318 11:57:30.022526 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:30 crc kubenswrapper[4965]: I0318 11:57:30.022580 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:30 crc kubenswrapper[4965]: I0318 11:57:30.022600 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:30 crc kubenswrapper[4965]: I0318 11:57:30.023481 4965 scope.go:117] "RemoveContainer" containerID="c548aa614b02d0d8d1a6dfca392f9f1963a21274554ba500036d15d4cd01627c" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.023830 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.861402 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87d7b278c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,LastTimestamp:2026-03-18 11:56:35.949009094 +0000 UTC m=+0.935196583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 
11:57:30.864136 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25c872 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006897778 +0000 UTC m=+0.993085257,LastTimestamp:2026-03-18 11:56:36.006897778 +0000 UTC m=+0.993085257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.869836 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25fca9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006911145 +0000 UTC m=+0.993098624,LastTimestamp:2026-03-18 11:56:36.006911145 +0000 UTC m=+0.993098624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.871530 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189ded87db262546 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006921542 +0000 UTC m=+0.993109021,LastTimestamp:2026-03-18 11:56:36.006921542 +0000 UTC m=+0.993109021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.876369 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87e041c663 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.092618339 +0000 UTC m=+1.078805828,LastTimestamp:2026-03-18 11:56:36.092618339 +0000 UTC m=+1.078805828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.878604 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25c872\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25c872 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006897778 +0000 UTC m=+0.993085257,LastTimestamp:2026-03-18 11:56:36.124340784 +0000 UTC m=+1.110528263,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.885574 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25fca9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25fca9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006911145 +0000 UTC m=+0.993098624,LastTimestamp:2026-03-18 11:56:36.124354341 +0000 UTC m=+1.110541820,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.892850 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db262546\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db262546 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006921542 +0000 UTC m=+0.993109021,LastTimestamp:2026-03-18 11:56:36.124362999 +0000 UTC m=+1.110550488,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.900338 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25c872\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25c872 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006897778 +0000 UTC m=+0.993085257,LastTimestamp:2026-03-18 11:56:36.126537543 +0000 UTC m=+1.112725062,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.906914 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25fca9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25fca9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006911145 +0000 UTC 
m=+0.993098624,LastTimestamp:2026-03-18 11:56:36.126652285 +0000 UTC m=+1.112839804,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.913606 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db262546\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db262546 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006921542 +0000 UTC m=+0.993109021,LastTimestamp:2026-03-18 11:56:36.126834181 +0000 UTC m=+1.113021700,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.920280 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25c872\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25c872 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006897778 +0000 UTC m=+0.993085257,LastTimestamp:2026-03-18 11:56:36.127310766 +0000 UTC m=+1.113498285,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.927291 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25fca9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25fca9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006911145 +0000 UTC m=+0.993098624,LastTimestamp:2026-03-18 11:56:36.127346917 +0000 UTC m=+1.113534436,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.933958 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db262546\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db262546 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006921542 +0000 UTC m=+0.993109021,LastTimestamp:2026-03-18 11:56:36.127366512 +0000 UTC m=+1.113554021,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.940960 4965 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189ded87db25c872\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25c872 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006897778 +0000 UTC m=+0.993085257,LastTimestamp:2026-03-18 11:56:36.129056263 +0000 UTC m=+1.115243742,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.947990 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25fca9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25fca9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006911145 +0000 UTC m=+0.993098624,LastTimestamp:2026-03-18 11:56:36.129067581 +0000 UTC m=+1.115255060,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.955168 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db262546\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db262546 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006921542 +0000 UTC m=+0.993109021,LastTimestamp:2026-03-18 11:56:36.129077678 +0000 UTC m=+1.115265157,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: I0318 11:57:30.955261 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.960088 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25c872\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25c872 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006897778 +0000 UTC m=+0.993085257,LastTimestamp:2026-03-18 11:56:36.129130236 +0000 UTC m=+1.115317725,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.962906 4965 event.go:359] "Server rejected 
event (will not retry!)" err="events \"crc.189ded87db25fca9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25fca9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006911145 +0000 UTC m=+0.993098624,LastTimestamp:2026-03-18 11:56:36.129144332 +0000 UTC m=+1.115331821,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.967231 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db262546\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db262546 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006921542 +0000 UTC m=+0.993109021,LastTimestamp:2026-03-18 11:56:36.129167257 +0000 UTC m=+1.115354746,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.970208 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25c872\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189ded87db25c872 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006897778 +0000 UTC m=+0.993085257,LastTimestamp:2026-03-18 11:56:36.130413165 +0000 UTC m=+1.116600664,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.972882 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25fca9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25fca9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006911145 +0000 UTC m=+0.993098624,LastTimestamp:2026-03-18 11:56:36.130424892 +0000 UTC m=+1.116612381,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.978204 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db262546\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db262546 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006921542 +0000 UTC m=+0.993109021,LastTimestamp:2026-03-18 11:56:36.13043519 +0000 UTC m=+1.116622679,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.982448 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25c872\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25c872 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006897778 +0000 UTC m=+0.993085257,LastTimestamp:2026-03-18 11:56:36.130639091 +0000 UTC m=+1.116826580,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.986403 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ded87db25fca9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ded87db25fca9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.006911145 +0000 UTC m=+0.993098624,LastTimestamp:2026-03-18 11:56:36.130649258 +0000 UTC m=+1.116836747,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.988860 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ded87f976db4e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.515527502 +0000 UTC m=+1.501715011,LastTimestamp:2026-03-18 11:56:36.515527502 +0000 UTC m=+1.501715011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:30 crc kubenswrapper[4965]: E0318 11:57:30.994185 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded87f9d0765d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.521399901 +0000 UTC m=+1.507587380,LastTimestamp:2026-03-18 11:56:36.521399901 +0000 UTC m=+1.507587380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.000571 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded87fa81582e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.532992046 +0000 UTC m=+1.519179525,LastTimestamp:2026-03-18 11:56:36.532992046 +0000 UTC m=+1.519179525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.005487 4965 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded87fa82b0f1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.533080305 +0000 UTC m=+1.519267824,LastTimestamp:2026-03-18 11:56:36.533080305 +0000 UTC m=+1.519267824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.010038 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded87fb7b9836 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:36.549392438 +0000 UTC 
m=+1.535579917,LastTimestamp:2026-03-18 11:56:36.549392438 +0000 UTC m=+1.535579917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.015222 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded881d049bd8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.112019928 +0000 UTC m=+2.098207407,LastTimestamp:2026-03-18 11:56:37.112019928 +0000 UTC m=+2.098207407,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.021092 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded881d105846 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.112789062 +0000 UTC 
m=+2.098976541,LastTimestamp:2026-03-18 11:56:37.112789062 +0000 UTC m=+2.098976541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.026299 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded881d140dcf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.113032143 +0000 UTC m=+2.099219622,LastTimestamp:2026-03-18 11:56:37.113032143 +0000 UTC m=+2.099219622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.032763 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded881d31c3aa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.114979242 +0000 UTC m=+2.101166721,LastTimestamp:2026-03-18 11:56:37.114979242 +0000 UTC m=+2.101166721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.039068 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ded881dc05c59 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.124324441 +0000 UTC m=+2.110511930,LastTimestamp:2026-03-18 11:56:37.124324441 +0000 UTC m=+2.110511930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.045589 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded881de0f200 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.126459904 +0000 UTC m=+2.112647413,LastTimestamp:2026-03-18 11:56:37.126459904 +0000 UTC m=+2.112647413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.052466 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded881df0f548 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.12750932 +0000 UTC m=+2.113696839,LastTimestamp:2026-03-18 11:56:37.12750932 +0000 UTC m=+2.113696839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.057500 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded881e032300 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.128700672 +0000 UTC m=+2.114888191,LastTimestamp:2026-03-18 11:56:37.128700672 +0000 UTC m=+2.114888191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.064410 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded881e2014c5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.130597573 +0000 UTC m=+2.116785052,LastTimestamp:2026-03-18 11:56:37.130597573 +0000 UTC m=+2.116785052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.071622 4965 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded881e4d0f36 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.13354527 +0000 UTC m=+2.119732779,LastTimestamp:2026-03-18 11:56:37.13354527 +0000 UTC m=+2.119732779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.073000 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ded881eec9a7e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.14400115 +0000 UTC m=+2.130188629,LastTimestamp:2026-03-18 11:56:37.14400115 +0000 UTC m=+2.130188629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.080404 4965 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded88373d77ac openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.551953836 +0000 UTC m=+2.538141325,LastTimestamp:2026-03-18 11:56:37.551953836 +0000 UTC m=+2.538141325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.086508 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded884f07eacd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.951097549 +0000 UTC m=+2.937285028,LastTimestamp:2026-03-18 11:56:37.951097549 +0000 UTC m=+2.937285028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.092934 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded884f24a955 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.952981333 +0000 UTC m=+2.939168822,LastTimestamp:2026-03-18 11:56:37.952981333 +0000 UTC m=+2.939168822,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.099064 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88547ace4a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.04251297 +0000 UTC m=+3.028700499,LastTimestamp:2026-03-18 11:56:38.04251297 +0000 UTC m=+3.028700499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.105361 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded8854a4d79f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.045267871 +0000 UTC m=+3.031455360,LastTimestamp:2026-03-18 11:56:38.045267871 +0000 UTC m=+3.031455360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.111432 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ded8854ca9649 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.047741513 +0000 UTC m=+3.033928992,LastTimestamp:2026-03-18 11:56:38.047741513 +0000 UTC m=+3.033928992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.117962 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded8854d4b690 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.048405136 +0000 UTC m=+3.034592655,LastTimestamp:2026-03-18 11:56:38.048405136 +0000 UTC m=+3.034592655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 
11:57:31.123957 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded8869c64029 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.399778857 +0000 UTC m=+3.385966375,LastTimestamp:2026-03-18 11:56:38.399778857 +0000 UTC m=+3.385966375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.130780 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded8869e15b6d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.401555309 +0000 UTC m=+3.387742818,LastTimestamp:2026-03-18 11:56:38.401555309 +0000 UTC m=+3.387742818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.135640 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded8869e26922 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.401624354 +0000 UTC m=+3.387811863,LastTimestamp:2026-03-18 11:56:38.401624354 +0000 UTC m=+3.387811863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.142214 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded8869e7a275 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.401966709 +0000 UTC m=+3.388154228,LastTimestamp:2026-03-18 11:56:38.401966709 +0000 UTC m=+3.388154228,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.150645 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ded8869eca325 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.402294565 +0000 UTC m=+3.388482084,LastTimestamp:2026-03-18 11:56:38.402294565 +0000 UTC m=+3.388482084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.157521 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ded88752d3d06 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.591077638 
+0000 UTC m=+3.577265147,LastTimestamp:2026-03-18 11:56:38.591077638 +0000 UTC m=+3.577265147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.164683 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded88753092d4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.591296212 +0000 UTC m=+3.577483701,LastTimestamp:2026-03-18 11:56:38.591296212 +0000 UTC m=+3.577483701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.169957 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded88754ca670 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.59313624 +0000 UTC m=+3.579323729,LastTimestamp:2026-03-18 11:56:38.59313624 +0000 UTC m=+3.579323729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.176551 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded887647cfa5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.609596325 +0000 UTC m=+3.595783824,LastTimestamp:2026-03-18 11:56:38.609596325 +0000 UTC m=+3.595783824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.182043 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded8876510cde openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.610201822 +0000 UTC m=+3.596389331,LastTimestamp:2026-03-18 11:56:38.610201822 +0000 UTC m=+3.596389331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.189091 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded887653727e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.61035891 +0000 UTC m=+3.596546419,LastTimestamp:2026-03-18 11:56:38.61035891 +0000 UTC m=+3.596546419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.194591 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded8876634ef2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.611398386 +0000 UTC m=+3.597585875,LastTimestamp:2026-03-18 11:56:38.611398386 +0000 UTC m=+3.597585875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.204912 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88766c6b83 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.611995523 +0000 UTC 
m=+3.598183022,LastTimestamp:2026-03-18 11:56:38.611995523 +0000 UTC m=+3.598183022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.210201 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded8883fb4d17 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.839463191 +0000 UTC m=+3.825650670,LastTimestamp:2026-03-18 11:56:38.839463191 +0000 UTC m=+3.825650670,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.215914 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded8885c71213 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.869594643 +0000 UTC m=+3.855782112,LastTimestamp:2026-03-18 11:56:38.869594643 +0000 UTC m=+3.855782112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.220544 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded8885ca825c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.869819996 +0000 UTC m=+3.856007475,LastTimestamp:2026-03-18 11:56:38.869819996 +0000 UTC m=+3.856007475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.227063 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded8885caa264 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.869828196 +0000 UTC m=+3.856015675,LastTimestamp:2026-03-18 11:56:38.869828196 +0000 UTC m=+3.856015675,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.233769 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded8887046ed0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.890393296 +0000 UTC m=+3.876580775,LastTimestamp:2026-03-18 11:56:38.890393296 +0000 UTC m=+3.876580775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.240209 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded8887085db1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.890651057 +0000 UTC m=+3.876838536,LastTimestamp:2026-03-18 11:56:38.890651057 +0000 UTC m=+3.876838536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.244759 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded88871d1677 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.892009079 +0000 UTC m=+3.878196558,LastTimestamp:2026-03-18 11:56:38.892009079 +0000 UTC m=+3.878196558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.249473 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88871e75cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:38.892099023 +0000 UTC m=+3.878286502,LastTimestamp:2026-03-18 11:56:38.892099023 +0000 UTC m=+3.878286502,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.254528 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded8890c3e463 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.053935715 +0000 UTC m=+4.040123184,LastTimestamp:2026-03-18 11:56:39.053935715 +0000 UTC m=+4.040123184,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.259847 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded8892e1538f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.089419151 +0000 UTC m=+4.075606630,LastTimestamp:2026-03-18 11:56:39.089419151 +0000 UTC m=+4.075606630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.265118 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88933ca145 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.095402821 +0000 UTC m=+4.081590300,LastTimestamp:2026-03-18 11:56:39.095402821 +0000 UTC m=+4.081590300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.269954 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88947108a8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.115614376 +0000 UTC m=+4.101801855,LastTimestamp:2026-03-18 11:56:39.115614376 +0000 UTC m=+4.101801855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.274400 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded8894808c85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.116631173 +0000 UTC m=+4.102818652,LastTimestamp:2026-03-18 11:56:39.116631173 +0000 UTC m=+4.102818652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.278475 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ded8894b89a62 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.120304738 +0000 UTC m=+4.106492217,LastTimestamp:2026-03-18 11:56:39.120304738 +0000 UTC m=+4.106492217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.281877 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded889c6673fe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.249138686 +0000 UTC m=+4.235326155,LastTimestamp:2026-03-18 11:56:39.249138686 +0000 UTC m=+4.235326155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.283122 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded889d3a8c65 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.263038565 +0000 UTC m=+4.249226044,LastTimestamp:2026-03-18 11:56:39.263038565 +0000 UTC m=+4.249226044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.286473 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded889fa9985d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.303870557 +0000 UTC m=+4.290058046,LastTimestamp:2026-03-18 11:56:39.303870557 +0000 UTC m=+4.290058046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.290542 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88a07764d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.317357776 +0000 UTC m=+4.303545255,LastTimestamp:2026-03-18 
11:56:39.317357776 +0000 UTC m=+4.303545255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.296100 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88a088ad89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.318490505 +0000 UTC m=+4.304677994,LastTimestamp:2026-03-18 11:56:39.318490505 +0000 UTC m=+4.304677994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.300253 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88ac39788b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.514626187 +0000 UTC m=+4.500813666,LastTimestamp:2026-03-18 11:56:39.514626187 +0000 UTC m=+4.500813666,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.304407 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88ad479c31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.532330033 +0000 UTC m=+4.518517512,LastTimestamp:2026-03-18 11:56:39.532330033 +0000 UTC m=+4.518517512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.309544 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded88cd8097ba 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.072935354 +0000 UTC m=+5.059122833,LastTimestamp:2026-03-18 11:56:40.072935354 +0000 UTC m=+5.059122833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.316060 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded88db7bafc5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.307494853 +0000 UTC m=+5.293682332,LastTimestamp:2026-03-18 11:56:40.307494853 +0000 UTC m=+5.293682332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.321397 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ded88dc2d5f65 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.319139685 +0000 UTC m=+5.305327164,LastTimestamp:2026-03-18 11:56:40.319139685 +0000 UTC m=+5.305327164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.325239 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded88dc3f5d78 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.32031884 +0000 UTC m=+5.306506319,LastTimestamp:2026-03-18 11:56:40.32031884 +0000 UTC m=+5.306506319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.329796 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded88e819e6b4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.519190196 +0000 UTC m=+5.505377685,LastTimestamp:2026-03-18 11:56:40.519190196 +0000 UTC m=+5.505377685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.336369 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded88e9237eba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.536596154 +0000 UTC m=+5.522783643,LastTimestamp:2026-03-18 11:56:40.536596154 +0000 UTC m=+5.522783643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.341293 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded88e93fc97c openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.5384503 +0000 UTC m=+5.524637799,LastTimestamp:2026-03-18 11:56:40.5384503 +0000 UTC m=+5.524637799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.346091 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded88f3f5368f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.718112399 +0000 UTC m=+5.704299878,LastTimestamp:2026-03-18 11:56:40.718112399 +0000 UTC m=+5.704299878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.350478 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ded88f494e29a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.728576666 +0000 UTC m=+5.714764135,LastTimestamp:2026-03-18 11:56:40.728576666 +0000 UTC m=+5.714764135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.356717 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded88f4a2145e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.729441374 +0000 UTC m=+5.715628853,LastTimestamp:2026-03-18 11:56:40.729441374 +0000 UTC m=+5.715628853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.363600 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded8903947eb8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.980209336 +0000 UTC m=+5.966396815,LastTimestamp:2026-03-18 11:56:40.980209336 +0000 UTC m=+5.966396815,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.370195 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded8904630752 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.993744722 +0000 UTC m=+5.979932241,LastTimestamp:2026-03-18 11:56:40.993744722 +0000 UTC m=+5.979932241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.375106 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ded89047a18b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:40.995256503 +0000 UTC m=+5.981443982,LastTimestamp:2026-03-18 11:56:40.995256503 +0000 UTC m=+5.981443982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.381294 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded89130385be openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:41.23914387 +0000 UTC m=+6.225331359,LastTimestamp:2026-03-18 11:56:41.23914387 +0000 UTC m=+6.225331359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.385068 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ded8913fc2527 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:41.255437607 +0000 UTC m=+6.241625116,LastTimestamp:2026-03-18 11:56:41.255437607 +0000 UTC m=+6.241625116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.392989 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 11:57:31 crc kubenswrapper[4965]: &Event{ObjectMeta:{kube-controller-manager-crc.189ded8a5d7b52d8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 11:57:31 crc kubenswrapper[4965]: body: Mar 18 11:57:31 crc kubenswrapper[4965]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:46.78347644 +0000 UTC m=+11.769663949,LastTimestamp:2026-03-18 11:56:46.78347644 +0000 UTC m=+11.769663949,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 11:57:31 crc kubenswrapper[4965]: > Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.398907 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded8a5d7c70e8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:46.783549672 +0000 UTC m=+11.769737181,LastTimestamp:2026-03-18 11:56:46.783549672 +0000 UTC m=+11.769737181,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.404369 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 11:57:31 crc kubenswrapper[4965]: &Event{ObjectMeta:{kube-apiserver-crc.189ded8b500a6c98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 11:57:31 crc kubenswrapper[4965]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 11:57:31 crc kubenswrapper[4965]: Mar 18 11:57:31 crc kubenswrapper[4965]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:50.852940952 +0000 UTC m=+15.839128441,LastTimestamp:2026-03-18 11:56:50.852940952 +0000 UTC m=+15.839128441,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 11:57:31 crc kubenswrapper[4965]: > Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.408316 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded8b500c532b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:50.853065515 +0000 UTC m=+15.839252994,LastTimestamp:2026-03-18 11:56:50.853065515 +0000 UTC m=+15.839252994,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.412924 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ded8b500a6c98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 11:57:31 crc kubenswrapper[4965]: &Event{ObjectMeta:{kube-apiserver-crc.189ded8b500a6c98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 11:57:31 crc kubenswrapper[4965]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 11:57:31 crc kubenswrapper[4965]: Mar 18 11:57:31 crc kubenswrapper[4965]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:50.852940952 +0000 UTC m=+15.839128441,LastTimestamp:2026-03-18 11:56:50.864753226 +0000 UTC m=+15.850940735,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 11:57:31 crc kubenswrapper[4965]: > Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.417195 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ded8b500c532b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded8b500c532b openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:50.853065515 +0000 UTC m=+15.839252994,LastTimestamp:2026-03-18 11:56:50.86488004 +0000 UTC m=+15.851067549,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.421863 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ded88a088ad89\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88a088ad89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.318490505 +0000 UTC m=+4.304677994,LastTimestamp:2026-03-18 11:56:51.143452247 +0000 UTC m=+16.129639736,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.426687 4965 event.go:359] "Server rejected event (will 
not retry!)" err="events \"kube-apiserver-crc.189ded88ac39788b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88ac39788b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.514626187 +0000 UTC m=+4.500813666,LastTimestamp:2026-03-18 11:56:51.386593911 +0000 UTC m=+16.372781400,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.431506 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ded88ad479c31\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ded88ad479c31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:39.532330033 +0000 UTC m=+4.518517512,LastTimestamp:2026-03-18 11:56:51.419730471 +0000 UTC m=+16.405917950,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.439836 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 11:57:31 crc kubenswrapper[4965]: &Event{ObjectMeta:{kube-controller-manager-crc.189ded8cb189d345 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 11:57:31 crc kubenswrapper[4965]: body: Mar 18 11:57:31 crc kubenswrapper[4965]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:56.783647557 +0000 UTC m=+21.769835066,LastTimestamp:2026-03-18 11:56:56.783647557 +0000 UTC m=+21.769835066,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 11:57:31 crc kubenswrapper[4965]: > Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.444699 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded8cb18ba100 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:56.78376576 +0000 UTC m=+21.769953269,LastTimestamp:2026-03-18 11:56:56.78376576 +0000 UTC m=+21.769953269,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.449898 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ded8cb189d345\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 11:57:31 crc kubenswrapper[4965]: &Event{ObjectMeta:{kube-controller-manager-crc.189ded8cb189d345 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 11:57:31 crc kubenswrapper[4965]: body: Mar 18 11:57:31 crc kubenswrapper[4965]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:56.783647557 +0000 UTC m=+21.769835066,LastTimestamp:2026-03-18 11:57:06.785101709 
+0000 UTC m=+31.771289218,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 11:57:31 crc kubenswrapper[4965]: > Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.454938 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ded8cb18ba100\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded8cb18ba100 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:56.78376576 +0000 UTC m=+21.769953269,LastTimestamp:2026-03-18 11:57:06.785303745 +0000 UTC m=+31.771491244,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.459821 4965 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded8f05e1cd1e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:57:06.788633886 +0000 UTC m=+31.774821395,LastTimestamp:2026-03-18 11:57:06.788633886 +0000 UTC m=+31.774821395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.464585 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ded881e032300\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded881e032300 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.128700672 +0000 UTC m=+2.114888191,LastTimestamp:2026-03-18 11:57:06.912452845 +0000 UTC m=+31.898640334,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.470587 4965 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ded88373d77ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded88373d77ac openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.551953836 +0000 UTC m=+2.538141325,LastTimestamp:2026-03-18 11:57:07.136865406 +0000 UTC m=+32.123052915,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.475218 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ded884f07eacd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded884f07eacd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:37.951097549 +0000 UTC 
m=+2.937285028,LastTimestamp:2026-03-18 11:57:07.147607821 +0000 UTC m=+32.133795300,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.481933 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ded8cb189d345\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 11:57:31 crc kubenswrapper[4965]: &Event{ObjectMeta:{kube-controller-manager-crc.189ded8cb189d345 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 11:57:31 crc kubenswrapper[4965]: body: Mar 18 11:57:31 crc kubenswrapper[4965]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:56.783647557 +0000 UTC m=+21.769835066,LastTimestamp:2026-03-18 11:57:16.784374029 +0000 UTC m=+41.770561528,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 11:57:31 crc kubenswrapper[4965]: > Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.486606 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ded8cb18ba100\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189ded8cb18ba100 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:56.78376576 +0000 UTC m=+21.769953269,LastTimestamp:2026-03-18 11:57:16.78442352 +0000 UTC m=+41.770611009,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:57:31 crc kubenswrapper[4965]: E0318 11:57:31.489039 4965 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ded8cb189d345\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 11:57:31 crc kubenswrapper[4965]: &Event{ObjectMeta:{kube-controller-manager-crc.189ded8cb189d345 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 11:57:31 crc kubenswrapper[4965]: body: Mar 18 11:57:31 crc kubenswrapper[4965]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:56:56.783647557 +0000 UTC m=+21.769835066,LastTimestamp:2026-03-18 11:57:26.784264425 +0000 UTC m=+51.770451944,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 11:57:31 crc kubenswrapper[4965]: > Mar 18 11:57:31 crc kubenswrapper[4965]: I0318 11:57:31.956905 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:32 crc kubenswrapper[4965]: I0318 11:57:32.272958 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:32 crc kubenswrapper[4965]: I0318 11:57:32.274158 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:32 crc kubenswrapper[4965]: I0318 11:57:32.274207 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:32 crc kubenswrapper[4965]: I0318 11:57:32.274216 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:32 crc kubenswrapper[4965]: I0318 11:57:32.274262 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:57:32 crc kubenswrapper[4965]: E0318 11:57:32.278130 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 11:57:32 crc kubenswrapper[4965]: E0318 11:57:32.279626 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User 
\"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 11:57:32 crc kubenswrapper[4965]: I0318 11:57:32.955603 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:33 crc kubenswrapper[4965]: I0318 11:57:33.790621 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:57:33 crc kubenswrapper[4965]: I0318 11:57:33.790860 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:33 crc kubenswrapper[4965]: I0318 11:57:33.792196 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:33 crc kubenswrapper[4965]: I0318 11:57:33.792239 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:33 crc kubenswrapper[4965]: I0318 11:57:33.792256 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:33 crc kubenswrapper[4965]: I0318 11:57:33.795029 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 11:57:33 crc kubenswrapper[4965]: I0318 11:57:33.954997 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:34 crc kubenswrapper[4965]: I0318 11:57:34.292951 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:34 crc 
kubenswrapper[4965]: I0318 11:57:34.293933 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:34 crc kubenswrapper[4965]: I0318 11:57:34.293981 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:34 crc kubenswrapper[4965]: I0318 11:57:34.293994 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:34 crc kubenswrapper[4965]: I0318 11:57:34.957221 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:35 crc kubenswrapper[4965]: I0318 11:57:35.954861 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:36 crc kubenswrapper[4965]: E0318 11:57:36.102248 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:57:36 crc kubenswrapper[4965]: I0318 11:57:36.953316 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:37 crc kubenswrapper[4965]: I0318 11:57:37.957300 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:38 crc kubenswrapper[4965]: I0318 11:57:38.954335 4965 csi_plugin.go:884] Failed 
to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:39 crc kubenswrapper[4965]: I0318 11:57:39.279776 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:39 crc kubenswrapper[4965]: I0318 11:57:39.281733 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:39 crc kubenswrapper[4965]: I0318 11:57:39.281789 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:39 crc kubenswrapper[4965]: I0318 11:57:39.281812 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:39 crc kubenswrapper[4965]: I0318 11:57:39.281842 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:57:39 crc kubenswrapper[4965]: E0318 11:57:39.285816 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 11:57:39 crc kubenswrapper[4965]: E0318 11:57:39.285954 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 11:57:39 crc kubenswrapper[4965]: I0318 11:57:39.957621 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:40 crc kubenswrapper[4965]: I0318 11:57:40.954365 4965 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.020744 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.022108 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.022174 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.022191 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.023470 4965 scope.go:117] "RemoveContainer" containerID="c548aa614b02d0d8d1a6dfca392f9f1963a21274554ba500036d15d4cd01627c" Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.313598 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.315522 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76"} Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.315668 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.316462 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 
11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.316486 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.316494 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:41 crc kubenswrapper[4965]: I0318 11:57:41.953022 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.320169 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.321371 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.323148 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76" exitCode=255 Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.323195 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76"} Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.323250 4965 scope.go:117] "RemoveContainer" containerID="c548aa614b02d0d8d1a6dfca392f9f1963a21274554ba500036d15d4cd01627c" Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.323413 4965 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.324314 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.324346 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.324360 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.325158 4965 scope.go:117] "RemoveContainer" containerID="733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76" Mar 18 11:57:42 crc kubenswrapper[4965]: E0318 11:57:42.325396 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.557390 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:57:42 crc kubenswrapper[4965]: I0318 11:57:42.955318 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:43 crc kubenswrapper[4965]: I0318 11:57:43.327465 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 11:57:43 crc kubenswrapper[4965]: I0318 11:57:43.330579 4965 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:43 crc kubenswrapper[4965]: I0318 11:57:43.332142 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:43 crc kubenswrapper[4965]: I0318 11:57:43.332274 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:43 crc kubenswrapper[4965]: I0318 11:57:43.332362 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:43 crc kubenswrapper[4965]: I0318 11:57:43.333348 4965 scope.go:117] "RemoveContainer" containerID="733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76" Mar 18 11:57:43 crc kubenswrapper[4965]: E0318 11:57:43.333787 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:57:43 crc kubenswrapper[4965]: I0318 11:57:43.955008 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:44 crc kubenswrapper[4965]: I0318 11:57:44.956980 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:45 crc kubenswrapper[4965]: I0318 11:57:45.013006 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:57:45 crc kubenswrapper[4965]: I0318 11:57:45.013272 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:45 crc kubenswrapper[4965]: I0318 11:57:45.015250 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:45 crc kubenswrapper[4965]: I0318 11:57:45.015307 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:45 crc kubenswrapper[4965]: I0318 11:57:45.015325 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:45 crc kubenswrapper[4965]: I0318 11:57:45.016161 4965 scope.go:117] "RemoveContainer" containerID="733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76" Mar 18 11:57:45 crc kubenswrapper[4965]: E0318 11:57:45.016430 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:57:45 crc kubenswrapper[4965]: I0318 11:57:45.954278 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:46 crc kubenswrapper[4965]: E0318 11:57:46.103088 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:57:46 crc kubenswrapper[4965]: I0318 11:57:46.286404 4965 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 18 11:57:46 crc kubenswrapper[4965]: I0318 11:57:46.287616 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:46 crc kubenswrapper[4965]: I0318 11:57:46.287676 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:46 crc kubenswrapper[4965]: I0318 11:57:46.287690 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:46 crc kubenswrapper[4965]: I0318 11:57:46.287714 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:57:46 crc kubenswrapper[4965]: E0318 11:57:46.290533 4965 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 11:57:46 crc kubenswrapper[4965]: E0318 11:57:46.290685 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 11:57:46 crc kubenswrapper[4965]: I0318 11:57:46.954208 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:47 crc kubenswrapper[4965]: I0318 11:57:47.954743 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:48 crc kubenswrapper[4965]: I0318 11:57:48.640486 4965 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 11:57:48 crc kubenswrapper[4965]: I0318 11:57:48.653381 4965 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 11:57:48 crc kubenswrapper[4965]: I0318 11:57:48.954018 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:49 crc kubenswrapper[4965]: I0318 11:57:49.955073 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 11:57:50 crc kubenswrapper[4965]: I0318 11:57:50.020461 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:50 crc kubenswrapper[4965]: I0318 11:57:50.021567 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:50 crc kubenswrapper[4965]: I0318 11:57:50.021600 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:50 crc kubenswrapper[4965]: I0318 11:57:50.021611 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:50 crc kubenswrapper[4965]: I0318 11:57:50.947111 4965 csr.go:261] certificate signing request csr-hr986 is approved, waiting to be issued Mar 18 11:57:50 crc kubenswrapper[4965]: I0318 11:57:50.958277 4965 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 18 11:57:50 crc kubenswrapper[4965]: I0318 11:57:50.961132 4965 csr.go:257] certificate signing request csr-hr986 is issued Mar 18 11:57:50 crc kubenswrapper[4965]: I0318 11:57:50.974834 4965 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 11:57:51 crc kubenswrapper[4965]: I0318 11:57:51.647426 4965 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 11:57:51 crc kubenswrapper[4965]: I0318 11:57:51.962924 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-04 01:53:02.117562267 +0000 UTC Mar 18 11:57:51 crc kubenswrapper[4965]: I0318 11:57:51.962983 4965 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6997h55m10.154584569s for next certificate rotation Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.291389 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.293217 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.293248 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.293256 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.293324 4965 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.301647 4965 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.301857 4965 kubelet_node_status.go:79] "Successfully registered node" 
node="crc" Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.301876 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.305119 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.305142 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.305150 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.305161 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.305171 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:57:53Z","lastTransitionTime":"2026-03-18T11:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.317880 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6bb909a7-2031-4da1-8950-7746d364df6b\\\",\\\"systemUUID\\\":\\\"f21bc216-4db6-44fa-8f07-5ffceb9c90c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.328036 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.328293 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.329142 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.330111 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.330217 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:57:53Z","lastTransitionTime":"2026-03-18T11:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.342614 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6bb909a7-2031-4da1-8950-7746d364df6b\\\",\\\"systemUUID\\\":\\\"f21bc216-4db6-44fa-8f07-5ffceb9c90c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.350121 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.350147 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.350159 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.350173 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.350184 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:57:53Z","lastTransitionTime":"2026-03-18T11:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.362098 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6bb909a7-2031-4da1-8950-7746d364df6b\\\",\\\"systemUUID\\\":\\\"f21bc216-4db6-44fa-8f07-5ffceb9c90c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.368985 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.369126 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.369227 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.369317 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:57:53 crc kubenswrapper[4965]: I0318 11:57:53.369407 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:57:53Z","lastTransitionTime":"2026-03-18T11:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.382273 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:57:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6bb909a7-2031-4da1-8950-7746d364df6b\\\",\\\"systemUUID\\\":\\\"f21bc216-4db6-44fa-8f07-5ffceb9c90c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.382458 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.382494 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.482925 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.583849 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.684526 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.785594 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.886576 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:53 crc kubenswrapper[4965]: E0318 11:57:53.987781 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.089183 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.189812 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.290238 4965 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.391173 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.492250 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.593296 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.694130 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.795403 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.896680 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:54 crc kubenswrapper[4965]: E0318 11:57:54.997465 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:55 crc kubenswrapper[4965]: E0318 11:57:55.098388 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:55 crc kubenswrapper[4965]: E0318 11:57:55.199010 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:55 crc kubenswrapper[4965]: E0318 11:57:55.299752 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:55 crc kubenswrapper[4965]: E0318 11:57:55.400390 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:55 crc 
kubenswrapper[4965]: E0318 11:57:55.500734 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:55 crc kubenswrapper[4965]: E0318 11:57:55.601122 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:55 crc kubenswrapper[4965]: E0318 11:57:55.702494 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:55 crc kubenswrapper[4965]: E0318 11:57:55.803715 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:55 crc kubenswrapper[4965]: E0318 11:57:55.904890 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.005525 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.103694 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.105719 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.205830 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.307141 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.408573 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.510329 4965 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.610696 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.712188 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.813324 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:56 crc kubenswrapper[4965]: E0318 11:57:56.913866 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.015415 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: I0318 11:57:57.020825 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:57:57 crc kubenswrapper[4965]: I0318 11:57:57.022388 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:57:57 crc kubenswrapper[4965]: I0318 11:57:57.022552 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:57:57 crc kubenswrapper[4965]: I0318 11:57:57.022641 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:57:57 crc kubenswrapper[4965]: I0318 11:57:57.023370 4965 scope.go:117] "RemoveContainer" containerID="733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.023647 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.115810 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.217147 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.318175 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.419073 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.520282 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.620621 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.721018 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.821434 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:57 crc kubenswrapper[4965]: E0318 11:57:57.921931 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.022721 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.123696 4965 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.224157 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.325140 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.426267 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.526909 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.627928 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.728263 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.828646 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:58 crc kubenswrapper[4965]: E0318 11:57:58.928904 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc kubenswrapper[4965]: E0318 11:57:59.029284 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc kubenswrapper[4965]: E0318 11:57:59.129394 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc kubenswrapper[4965]: E0318 11:57:59.230261 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc 
kubenswrapper[4965]: E0318 11:57:59.331469 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc kubenswrapper[4965]: E0318 11:57:59.432035 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc kubenswrapper[4965]: E0318 11:57:59.533258 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc kubenswrapper[4965]: E0318 11:57:59.634370 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc kubenswrapper[4965]: E0318 11:57:59.734902 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc kubenswrapper[4965]: E0318 11:57:59.835609 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:57:59 crc kubenswrapper[4965]: E0318 11:57:59.936047 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.036713 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.137843 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.238430 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.339108 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.439608 4965 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.540412 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.641611 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.741893 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.842535 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:00 crc kubenswrapper[4965]: E0318 11:58:00.942763 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: I0318 11:58:01.020777 4965 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 11:58:01 crc kubenswrapper[4965]: I0318 11:58:01.022087 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:01 crc kubenswrapper[4965]: I0318 11:58:01.022121 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:01 crc kubenswrapper[4965]: I0318 11:58:01.022132 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.043773 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.144787 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.245883 4965 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.346857 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.447269 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.547435 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.648124 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.748323 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: I0318 11:58:01.802308 4965 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.848747 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:01 crc kubenswrapper[4965]: E0318 11:58:01.948937 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 11:58:02.049607 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 11:58:02.150156 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 11:58:02.251017 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 
11:58:02.352220 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 11:58:02.453328 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 11:58:02.553460 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 11:58:02.654501 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 11:58:02.755406 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 11:58:02.856638 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:02 crc kubenswrapper[4965]: E0318 11:58:02.957154 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.058134 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.158259 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.259360 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.359962 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.460904 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 
11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.547850 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.553359 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.553402 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.553411 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.553427 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.553438 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:03Z","lastTransitionTime":"2026-03-18T11:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.569123 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6bb909a7-2031-4da1-8950-7746d364df6b\\\",\\\"systemUUID\\\":\\\"f21bc216-4db6-44fa-8f07-5ffceb9c90c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.573766 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.573812 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.573825 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.573845 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.573859 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:03Z","lastTransitionTime":"2026-03-18T11:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.590299 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6bb909a7-2031-4da1-8950-7746d364df6b\\\",\\\"systemUUID\\\":\\\"f21bc216-4db6-44fa-8f07-5ffceb9c90c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.595310 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.595350 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.595368 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.595394 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.595412 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:03Z","lastTransitionTime":"2026-03-18T11:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.612156 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the 11:58:03.590299 attempt above; omitted] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.616337 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.616374 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.616388 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.616407 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.616419 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:03Z","lastTransitionTime":"2026-03-18T11:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.631031 4965 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6bb909a7-2031-4da1-8950-7746d364df6b\\\",\\\"systemUUID\\\":\\\"f21bc216-4db6-44fa-8f07-5ffceb9c90c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.631218 4965 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.631251 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.731864 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.832586 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: E0318 11:58:03.933318 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:03 crc kubenswrapper[4965]: I0318 11:58:03.983008 4965 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.033920 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.134844 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.235770 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.336867 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.437496 
4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.537909 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.639030 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.739384 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.840557 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:04 crc kubenswrapper[4965]: I0318 11:58:04.849464 4965 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 11:58:04 crc kubenswrapper[4965]: E0318 11:58:04.941608 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc kubenswrapper[4965]: E0318 11:58:05.042836 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc kubenswrapper[4965]: E0318 11:58:05.143019 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc kubenswrapper[4965]: E0318 11:58:05.243693 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc kubenswrapper[4965]: E0318 11:58:05.344287 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc kubenswrapper[4965]: E0318 11:58:05.444651 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc 
kubenswrapper[4965]: E0318 11:58:05.545690 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc kubenswrapper[4965]: E0318 11:58:05.646820 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc kubenswrapper[4965]: E0318 11:58:05.747332 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc kubenswrapper[4965]: E0318 11:58:05.848079 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:05 crc kubenswrapper[4965]: E0318 11:58:05.948264 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:06 crc kubenswrapper[4965]: E0318 11:58:06.049499 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:06 crc kubenswrapper[4965]: E0318 11:58:06.104457 4965 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 11:58:06 crc kubenswrapper[4965]: E0318 11:58:06.150390 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:06 crc kubenswrapper[4965]: E0318 11:58:06.250975 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:06 crc kubenswrapper[4965]: E0318 11:58:06.351151 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:06 crc kubenswrapper[4965]: E0318 11:58:06.451972 4965 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.524412 4965 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.555170 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.555215 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.555224 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.555237 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.555249 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:06Z","lastTransitionTime":"2026-03-18T11:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.658020 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.658077 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.658091 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.658109 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.658121 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:06Z","lastTransitionTime":"2026-03-18T11:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.760603 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.760739 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.760759 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.760784 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.760801 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:06Z","lastTransitionTime":"2026-03-18T11:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.863178 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.863224 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.863235 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.863251 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.863260 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:06Z","lastTransitionTime":"2026-03-18T11:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.966182 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.966242 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.966252 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.966272 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.966286 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:06Z","lastTransitionTime":"2026-03-18T11:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.990401 4965 apiserver.go:52] "Watching apiserver" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.995021 4965 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.995611 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.996268 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.996512 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.996686 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:06 crc kubenswrapper[4965]: E0318 11:58:06.996764 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.996922 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.996988 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 11:58:06 crc kubenswrapper[4965]: I0318 11:58:06.997077 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:06.997694 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:06.997339 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:06.998882 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:06.999360 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:06.999416 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.000536 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.000790 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.000810 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.000899 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.001041 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.002808 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.029956 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.045764 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.059500 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.061533 4965 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.070953 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.071001 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.071013 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.071031 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.071044 4965 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072304 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072359 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072387 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072409 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072432 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072460 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072486 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072508 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072532 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072499 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072595 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072553 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072832 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072839 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072870 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072906 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072940 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.072972 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073002 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073031 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073062 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073092 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073133 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073164 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073194 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 
11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073223 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073257 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073288 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073317 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073347 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073377 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073409 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073440 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073470 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073499 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073533 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 11:58:07 crc kubenswrapper[4965]: 
I0318 11:58:07.073564 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073576 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073595 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073630 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073667 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073703 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073725 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073757 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073788 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073811 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073858 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073891 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073924 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073955 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.073985 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.074054 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.074084 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.074910 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.074969 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.075174 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.075201 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.075307 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.075303 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.075556 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.075855 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.075974 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.076403 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.076398 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.076942 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.076949 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.077270 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.077459 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.077506 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.078122 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.078500 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.078895 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.078926 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.079225 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.079577 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.079988 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.080170 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.080409 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.080519 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.080763 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.080822 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.081056 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.081112 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.081161 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.081262 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.081317 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.081349 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.081861 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.081995 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.082107 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.082187 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.082391 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.082626 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.083569 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.084123 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.084390 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.084453 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.084827 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.084911 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.084961 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085010 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085013 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085050 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085110 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085155 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085194 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085235 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085587 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085639 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.085854 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.086218 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.086467 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.086498 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.086585 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087390 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087471 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087512 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087548 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087579 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087612 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087626 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087643 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087693 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087704 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087723 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087725 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087753 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087788 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087890 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087898 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.087978 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088102 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088145 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088175 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 
11:58:07.088205 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088211 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088230 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088239 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088303 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088334 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088535 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088756 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088792 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088754 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088753 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088845 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089845 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.088907 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.089180 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:07.588845738 +0000 UTC m=+92.575033457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089893 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089918 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089938 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089956 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089973 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090063 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090093 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090118 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090138 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090157 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090177 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090195 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090219 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090227 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090243 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090286 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090316 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090342 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090370 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090396 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090421 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090447 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090473 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090498 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090520 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090542 4965 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090566 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090595 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090622 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090694 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090727 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090755 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090781 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090812 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090838 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090866 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090893 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090918 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090945 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090972 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091000 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091029 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091058 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091096 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091122 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091147 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091176 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091202 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091226 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091255 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091279 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091307 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091332 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091355 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091374 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091393 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091412 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091430 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 11:58:07 crc 
kubenswrapper[4965]: I0318 11:58:07.091448 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091475 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091494 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091514 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091533 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091551 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091568 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091586 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091606 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091626 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091645 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 11:58:07 crc 
kubenswrapper[4965]: I0318 11:58:07.091668 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091701 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091719 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091739 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091759 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091779 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091799 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091819 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091839 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091858 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091877 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091896 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091912 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091932 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091950 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091968 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091985 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092004 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092020 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092040 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092063 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092087 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 
11:58:07.092112 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092130 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092167 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092195 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092219 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092248 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092290 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092310 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092331 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092349 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092371 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092393 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092412 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092430 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092448 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092465 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092483 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" 
(UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092500 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092517 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092534 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092550 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092592 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092617 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092638 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092663 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092706 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092725 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092744 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092762 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092782 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.092799 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093221 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093246 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093264 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093288 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093387 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093405 4965 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on 
node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093420 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093433 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093445 4965 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093458 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093471 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093485 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093498 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc 
kubenswrapper[4965]: I0318 11:58:07.093510 4965 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093522 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093534 4965 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093547 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093560 4965 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093572 4965 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093585 4965 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093601 4965 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093612 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093621 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093632 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093641 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093652 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093664 4965 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093674 4965 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node 
\"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093704 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093717 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093730 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093743 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093754 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093765 4965 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093778 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093791 4965 
reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093805 4965 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093817 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093829 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093839 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093848 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093857 4965 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093867 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093876 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093886 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093895 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093904 4965 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093914 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093924 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093933 4965 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" 
DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093942 4965 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093951 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093960 4965 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093970 4965 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093980 4965 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.093990 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.094001 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 
11:58:07.094012 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.094027 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.094041 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.094054 4965 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.094067 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.094080 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.094095 4965 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.094106 4965 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.094120 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.099520 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.100170 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.101664 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.102615 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.103295 4965 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090252 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.114747 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090429 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089272 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089276 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089701 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090667 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090720 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.090954 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.091783 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.089208 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.099161 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.099174 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.099481 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.099707 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.115083 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:07.6150532 +0000 UTC m=+92.601240689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.115442 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.115679 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.099741 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.099795 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.099915 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.100714 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.100769 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.101404 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.116373 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.101863 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.102395 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.102982 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.103107 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.116717 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.116828 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.116838 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:07.616806918 +0000 UTC m=+92.602994647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.117201 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.103250 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.103605 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.107311 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.117255 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.108008 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.108103 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.108624 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.109395 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.111129 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.103438 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.113356 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.110665 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.114176 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.114650 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.113730 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.114606 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.118457 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.118520 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.118566 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.118586 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.118885 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.119007 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.119156 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.119586 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.119630 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.119874 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.119981 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.120324 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.121347 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.121415 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.121440 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.121586 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:07.621532478 +0000 UTC m=+92.607720177 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.122293 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.122680 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.122706 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.122721 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.122773 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-18 11:58:07.622757612 +0000 UTC m=+92.608945081 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.123871 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.124553 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.124801 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.125366 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.126018 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.126206 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.126799 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.127069 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.127112 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.127314 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.127306 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.127570 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.128174 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.128249 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.128660 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.128727 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.134960 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.135926 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.136103 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.136122 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.136457 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.138331 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.138782 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.138827 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.138964 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.139044 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.139796 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.139883 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.139949 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.140561 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.140908 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.141006 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.141111 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.141496 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.141532 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.141834 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.142151 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.142176 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.142221 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.142247 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.142850 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.142852 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.143075 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.143600 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.143858 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.144011 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.144021 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.144367 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.144962 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.145096 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.145170 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.145584 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.145636 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.145774 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.145978 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.146027 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.146430 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.146523 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.146638 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.147122 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.147151 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.147461 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.147636 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.148193 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.148622 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.148639 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.148705 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.148969 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.149063 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.149095 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.149021 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.149107 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.149784 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.150347 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.151236 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.155730 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.170097 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.171086 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.175679 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.175720 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.175732 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.175750 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.175773 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.178925 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195134 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195182 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195236 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195263 4965 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195277 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195293 4965 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195305 4965 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195320 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195332 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195344 4965 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195342 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195360 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195356 4965 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195427 4965 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195438 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195449 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195458 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195468 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195478 
4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195488 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195497 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195507 4965 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195516 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195525 4965 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195537 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195549 4965 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195561 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195572 4965 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195582 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195591 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195600 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195609 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195618 4965 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195629 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195641 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195650 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195663 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195689 4965 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195701 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195713 4965 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc 
kubenswrapper[4965]: I0318 11:58:07.195722 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195731 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195740 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195749 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195758 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195776 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195786 4965 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195795 4965 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195806 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195815 4965 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195825 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195834 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195844 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195853 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195864 4965 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195873 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195882 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195891 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195900 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195909 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195918 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195927 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node 
\"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195936 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195946 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195955 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195967 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195977 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195985 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.195994 4965 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc 
kubenswrapper[4965]: I0318 11:58:07.196002 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196014 4965 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196023 4965 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196032 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196041 4965 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196050 4965 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196060 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196070 4965 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196080 4965 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196090 4965 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196099 4965 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196108 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196119 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196128 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196139 4965 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196148 4965 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196160 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196169 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196177 4965 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196186 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196195 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196203 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196213 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196222 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196231 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196240 4965 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196249 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196258 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196267 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196275 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196284 4965 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196293 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196303 4965 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196312 4965 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196320 4965 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196329 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196340 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196348 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196356 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196365 4965 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196377 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196386 4965 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196394 4965 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196403 4965 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196412 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196421 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196429 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196437 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196447 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196456 4965 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196467 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196476 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196485 4965 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196493 4965 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196502 4965 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196511 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196521 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196529 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196537 4965 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196546 4965 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196555 4965 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196564 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196573 4965 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196582 4965 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196591 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196599 4965 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.196611 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.278511 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.278590 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.278638 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.278710 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.278735 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.322132 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.332751 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.339042 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.380956 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.380995 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.381007 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.381022 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.381034 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.391949 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"21a56241c4c974e921308772faf61b7be6bb90b60364e4e32df4fcac67c77744"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.392828 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f55defb56e042003cad55bd3b5beaadf591a4303577a2087d97c0da00e0a88e9"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.393597 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2bbc8b736a2b3065524a6926537fa24739a1b36a5c146eb2d69c828414c0f6d2"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.483375 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.483427 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.483439 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.483456 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.483470 4965 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.586215 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.586496 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.586506 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.586521 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.586531 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.599927 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.600112 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:08.600093644 +0000 UTC m=+93.586281143 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.689450 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.689491 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.689502 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.689518 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: 
I0318 11:58:07.689528 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.701201 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.701256 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.701315 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.701338 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701415 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701411 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701457 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701468 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:08.701451547 +0000 UTC m=+93.687639036 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701473 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701484 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701501 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701512 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701538 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:08.701515549 +0000 UTC m=+93.687703038 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701430 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701560 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:08.70154905 +0000 UTC m=+93.687736539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:07 crc kubenswrapper[4965]: E0318 11:58:07.701590 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:08.701576861 +0000 UTC m=+93.687764410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.792197 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.792257 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.792270 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.792287 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.792300 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.894483 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.894552 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.894569 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.894594 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.894612 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.997330 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.997382 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.997395 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.997411 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:07 crc kubenswrapper[4965]: I0318 11:58:07.997428 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:07Z","lastTransitionTime":"2026-03-18T11:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.027443 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.028118 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.029638 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.030405 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.031588 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.032259 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.033031 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.034175 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.034976 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.036868 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.036987 4965 scope.go:117] "RemoveContainer" containerID="733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76" Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.037159 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.037512 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.038905 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.039563 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 11:58:08 crc 
kubenswrapper[4965]: I0318 11:58:08.040235 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.041508 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.042213 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.043359 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.043872 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.044583 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.045990 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.046593 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 11:58:08 crc 
kubenswrapper[4965]: I0318 11:58:08.047803 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.048337 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.049590 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.050166 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.050932 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.052334 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.052831 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.053376 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 11:58:08 crc 
kubenswrapper[4965]: I0318 11:58:08.053878 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.054312 4965 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.054406 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.055755 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.056229 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.056599 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.057731 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.058328 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.058909 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.059561 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.060194 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.063210 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.063950 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.064897 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.065458 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.066423 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.067210 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.068176 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.069197 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.071027 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.071754 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.072385 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.073593 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.074368 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.075497 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.076138 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.099648 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.099697 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.099708 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.099726 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.099736 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:08Z","lastTransitionTime":"2026-03-18T11:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.201609 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.201708 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.201732 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.201760 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.201780 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:08Z","lastTransitionTime":"2026-03-18T11:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.304863 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.304925 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.304945 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.304969 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.304986 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:08Z","lastTransitionTime":"2026-03-18T11:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.397401 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d2f1536afea5ad7018511508fda82c958a1fa56213778e2bab8e6850353fb136"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.397476 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0d18e5cc0161ebd5fa957725fa886db617d0ba6ad6a2df97278f5e9a32ea5321"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.399438 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"72d3974cb83897715187383f582a02d45400f918091d0f7a0c1f1a56dae1998b"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.400140 4965 scope.go:117] "RemoveContainer" containerID="733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76" Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.400361 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.409212 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.409249 4965 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.409260 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.409586 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.409622 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:08Z","lastTransitionTime":"2026-03-18T11:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.418042 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2f1536afea5ad7018511508fda82c958a1fa56213778e2bab8e6850353fb136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d18e5cc0161ebd5fa957725fa886db617d0ba6ad6a2df97278f5e9a32ea5321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.430673 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.446738 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.466117 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.483440 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.509854 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1489cfbddc1072ee2238c82e981b8dd676b935c235de66cb53613a2479018238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35983903bdfe8136c0d115201f4bfb23009d1cb8019ea7f9b647614eb8b27afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://90185fb5121b4cb1ffd94f24cce668838aa69c89a757de56246a7e9e3254005a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T11:57:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 11:57:41.646182 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 11:57:41.646371 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 11:57:41.647166 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-693070532/tls.crt::/tmp/serving-cert-693070532/tls.key\\\\\\\"\\\\nI0318 11:57:41.814413 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 11:57:41.816940 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 11:57:41.816955 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 11:57:41.816974 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 11:57:41.816979 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 11:57:41.826560 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 11:57:41.826609 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 11:57:41.826620 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 11:57:41.826629 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 11:57:41.826636 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 11:57:41.826642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 11:57:41.826648 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 11:57:41.826955 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 11:57:41.829626 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T11:57:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6be9268fda0866f34859b014813011535f476e4a3004010d537f23df27440f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:56:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8686a109c054bacaff991779f6488d86d2eecb88f13b5e9e83e1680f4022af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8686a109c054bacaff991779f6488d86d2eecb88f13b5e9e83e1680f4022af83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T11:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T11:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T11:56:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.511874 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.511930 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.511943 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.511958 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.511969 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:08Z","lastTransitionTime":"2026-03-18T11:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.535966 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.558476 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T11:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1489cfbddc1072ee2238c82e981b8dd676b935c235de66cb53613a2479018238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35983903bdfe8136c0d115201f4bfb23009d1cb8019ea7f9b647614eb8b27afe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90185fb5121b4cb1ffd94f24cce668838aa69c89a757de56246a7e9e3254005a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T11:57:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 11:57:41.646182 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 11:57:41.646371 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 11:57:41.647166 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-693070532/tls.crt::/tmp/serving-cert-693070532/tls.key\\\\\\\"\\\\nI0318 11:57:41.814413 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 11:57:41.816940 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 11:57:41.816955 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 11:57:41.816974 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 11:57:41.816979 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 11:57:41.826560 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 11:57:41.826609 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 11:57:41.826620 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 11:57:41.826629 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 11:57:41.826636 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 11:57:41.826642 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 11:57:41.826648 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 11:57:41.826955 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 11:57:41.829626 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T11:57:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6be9268fda0866f34859b014813011535f476e4a3004010d537f23df27440f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:56:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8686a109c054bacaff991779f6488d86d2eecb88f13b5e9e83e1680f4022af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://868
6a109c054bacaff991779f6488d86d2eecb88f13b5e9e83e1680f4022af83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T11:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T11:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T11:56:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.572752 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d3974cb83897715187383f582a02d45400f918091d0f7a0c1f1a56dae1998b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.586471 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.599538 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2f1536afea5ad7018511508fda82c958a1fa56213778e2bab8e6850353fb136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d18e5cc0161ebd5fa957725fa886db617d0ba6ad6a2df97278f5e9a32ea5321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T11:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.609291 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.609557 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:10.609499087 +0000 UTC m=+95.595686606 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.614089 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.614128 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.614136 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.614152 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.614161 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:08Z","lastTransitionTime":"2026-03-18T11:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.615923 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.629156 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.640523 4965 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T11:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T11:58:08Z is after 2025-08-24T17:21:41Z" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.710653 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.710725 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.710754 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.710882 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711007 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711027 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711038 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711038 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711084 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:10.711069916 +0000 UTC m=+95.697257395 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711127 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711137 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:10.711115857 +0000 UTC m=+95.697303386 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711162 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:10.711152958 +0000 UTC m=+95.697340557 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711238 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711256 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711269 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:08 crc kubenswrapper[4965]: E0318 11:58:08.711304 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:10.711294482 +0000 UTC m=+95.697482041 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.720760 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.720809 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.720822 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.720838 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.720850 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:08Z","lastTransitionTime":"2026-03-18T11:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.822809 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.822846 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.822858 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.822872 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.822883 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:08Z","lastTransitionTime":"2026-03-18T11:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.925231 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.925259 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.925268 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.925279 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:08 crc kubenswrapper[4965]: I0318 11:58:08.925287 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:08Z","lastTransitionTime":"2026-03-18T11:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.020307 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:09 crc kubenswrapper[4965]: E0318 11:58:09.020410 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.020307 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:09 crc kubenswrapper[4965]: E0318 11:58:09.020466 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.020306 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:09 crc kubenswrapper[4965]: E0318 11:58:09.020508 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.026927 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.026948 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.026957 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.026968 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.026976 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.129299 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.129370 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.129387 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.129870 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.129930 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.232460 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.232503 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.232521 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.232544 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.232560 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.335907 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.335960 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.335977 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.336002 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.336020 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.439156 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.439226 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.439245 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.439270 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.439288 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.541811 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.541846 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.541857 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.541875 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.541886 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.644579 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.644644 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.644699 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.644730 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.644749 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.747211 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.747273 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.747291 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.747317 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.747333 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.850454 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.850501 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.850521 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.850547 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.850567 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.953203 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.953261 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.953277 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.953299 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:09 crc kubenswrapper[4965]: I0318 11:58:09.953315 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:09Z","lastTransitionTime":"2026-03-18T11:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.055712 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.055776 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.055787 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.055804 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.055819 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.158301 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.158356 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.158369 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.158388 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.158400 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.261972 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.262025 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.262037 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.262059 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.262074 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.364153 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.364212 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.364230 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.364253 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.364270 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.404809 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1807c6f7ab2b3e667f5e087effe13f243891b70d6b612e1edef4ddff0dc8cb65"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.466630 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.466694 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.466710 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.466727 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.466740 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.568669 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.568712 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.568723 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.568741 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.568755 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.628989 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.629179 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:14.629159805 +0000 UTC m=+99.615347304 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.673600 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.673653 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.673684 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.673701 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.673715 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.730432 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.730492 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.730516 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.730555 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.730701 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.730717 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.730726 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.730787 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:14.730774285 +0000 UTC m=+99.716961764 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.730855 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.730866 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.730874 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.730894 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:14.730888048 +0000 UTC m=+99.717075527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.730954 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.731322 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:14.73131242 +0000 UTC m=+99.717499899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.731460 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 11:58:10 crc kubenswrapper[4965]: E0318 11:58:10.731644 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:14.731618328 +0000 UTC m=+99.717805817 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.776714 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.776779 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.776803 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.776833 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.776855 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.879525 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.879567 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.879583 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.879606 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.879623 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.982495 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.982801 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.982871 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.982952 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:10 crc kubenswrapper[4965]: I0318 11:58:10.983010 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:10Z","lastTransitionTime":"2026-03-18T11:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.020078 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.020103 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 11:58:11 crc kubenswrapper[4965]: E0318 11:58:11.020188 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.020227 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 11:58:11 crc kubenswrapper[4965]: E0318 11:58:11.020328 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 11:58:11 crc kubenswrapper[4965]: E0318 11:58:11.020449 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.086188 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.086229 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.086240 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.086255 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.086266 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:11Z","lastTransitionTime":"2026-03-18T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.189230 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.189253 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.189261 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.189458 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.189467 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:11Z","lastTransitionTime":"2026-03-18T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.249926 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2jks9"]
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.250275 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2jks9"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.252841 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.252861 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.253519 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.270554 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-8l67n"]
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.271110 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8l67n"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.272850 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mh2cx"]
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.273882 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mh2cx"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.275870 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.276436 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.276452 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.277069 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.277836 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.278313 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-627t7"]
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.278791 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-627t7"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.283873 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.284073 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.284083 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.284184 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.284301 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.285004 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.286049 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.291336 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.291370 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.291382 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.291400 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.291413 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:11Z","lastTransitionTime":"2026-03-18T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.301275 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ts942"]
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.302140 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ts942"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.303794 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.304796 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.304828 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.304879 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.305027 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.305036 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.306402 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.336804 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfldv\" (UniqueName: \"kubernetes.io/projected/8b0a26bc-e371-4829-9e6f-95e93b1633e7-kube-api-access-vfldv\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.336848 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-var-lib-cni-multus\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.336872 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-hostroot\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.336891 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-conf-dir\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.336927 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-system-cni-dir\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.336999 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-cnibin\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337078 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9a53215-1d0d-47de-92c4-cea3209fe4fa-proxy-tls\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337225 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/21343843-2fbe-463b-b8a0-1efcaa504902-hosts-file\") pod \"node-resolver-2jks9\" (UID: \"21343843-2fbe-463b-b8a0-1efcaa504902\") " pod="openshift-dns/node-resolver-2jks9"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337263 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpsw5\" (UniqueName: \"kubernetes.io/projected/21343843-2fbe-463b-b8a0-1efcaa504902-kube-api-access-lpsw5\") pod \"node-resolver-2jks9\" (UID: \"21343843-2fbe-463b-b8a0-1efcaa504902\") " pod="openshift-dns/node-resolver-2jks9"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337306 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-os-release\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337336 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nghs7\" (UniqueName: \"kubernetes.io/projected/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-kube-api-access-nghs7\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337374 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w7bc\" (UniqueName: \"kubernetes.io/projected/e9a53215-1d0d-47de-92c4-cea3209fe4fa-kube-api-access-2w7bc\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337418 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-run-netns\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7"
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337469 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8b0a26bc-e371-4829-9e6f-95e93b1633e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx"
Mar 18
11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337509 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-cni-dir\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337573 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-cnibin\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337618 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-var-lib-kubelet\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337678 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-daemon-config\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337717 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e9a53215-1d0d-47de-92c4-cea3209fe4fa-rootfs\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 
11:58:11.337739 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9a53215-1d0d-47de-92c4-cea3209fe4fa-mcd-auth-proxy-config\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337767 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-cni-binary-copy\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337787 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-socket-dir-parent\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337808 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-os-release\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337828 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-var-lib-cni-bin\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: 
I0318 11:58:11.337849 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-etc-kubernetes\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337880 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-system-cni-dir\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337905 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-run-multus-certs\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337925 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b0a26bc-e371-4829-9e6f-95e93b1633e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.337958 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 
crc kubenswrapper[4965]: I0318 11:58:11.337976 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-run-k8s-cni-cncf-io\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.393487 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.393523 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.393533 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.393548 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.393559 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:11Z","lastTransitionTime":"2026-03-18T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439061 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e9a53215-1d0d-47de-92c4-cea3209fe4fa-rootfs\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439107 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9a53215-1d0d-47de-92c4-cea3209fe4fa-mcd-auth-proxy-config\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439128 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-cni-binary-copy\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439166 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-socket-dir-parent\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439195 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40a065b9-60d2-47fd-8d74-78d53ae612a9-env-overrides\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439212 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-os-release\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439230 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-var-lib-cni-bin\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439226 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e9a53215-1d0d-47de-92c4-cea3209fe4fa-rootfs\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439245 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-etc-kubernetes\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439275 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-etc-kubernetes\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439315 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439377 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-os-release\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439320 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-var-lib-cni-bin\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439421 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-socket-dir-parent\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439416 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-cni-bin\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439473 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40a065b9-60d2-47fd-8d74-78d53ae612a9-ovn-node-metrics-cert\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439500 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-system-cni-dir\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439544 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-run-multus-certs\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439588 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-run-multus-certs\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439635 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-system-cni-dir\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439648 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b0a26bc-e371-4829-9e6f-95e93b1633e7-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439694 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-run-openvswitch\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439730 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439751 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-run-k8s-cni-cncf-io\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439785 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-etc-openvswitch\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439808 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-run-ovn\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439821 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-run-k8s-cni-cncf-io\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439868 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439886 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40a065b9-60d2-47fd-8d74-78d53ae612a9-ovnkube-script-lib\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439916 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfldv\" (UniqueName: \"kubernetes.io/projected/8b0a26bc-e371-4829-9e6f-95e93b1633e7-kube-api-access-vfldv\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439964 4965 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-var-lib-cni-multus\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439980 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9a53215-1d0d-47de-92c4-cea3209fe4fa-mcd-auth-proxy-config\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439999 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-hostroot\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.439980 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-cni-binary-copy\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440017 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-var-lib-cni-multus\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440045 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-hostroot\") pod 
\"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440439 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-conf-dir\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440496 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-conf-dir\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440551 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-run-netns\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440650 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440738 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-system-cni-dir\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " 
pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440756 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b0a26bc-e371-4829-9e6f-95e93b1633e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440778 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-cnibin\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440808 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9a53215-1d0d-47de-92c4-cea3209fe4fa-proxy-tls\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440833 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-slash\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440850 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-var-lib-openvswitch\") pod \"ovnkube-node-ts942\" (UID: 
\"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440852 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-system-cni-dir\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440873 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-log-socket\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440915 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/21343843-2fbe-463b-b8a0-1efcaa504902-hosts-file\") pod \"node-resolver-2jks9\" (UID: \"21343843-2fbe-463b-b8a0-1efcaa504902\") " pod="openshift-dns/node-resolver-2jks9" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440938 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpsw5\" (UniqueName: \"kubernetes.io/projected/21343843-2fbe-463b-b8a0-1efcaa504902-kube-api-access-lpsw5\") pod \"node-resolver-2jks9\" (UID: \"21343843-2fbe-463b-b8a0-1efcaa504902\") " pod="openshift-dns/node-resolver-2jks9" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440965 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-systemd-units\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440986 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-run-systemd\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441003 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/21343843-2fbe-463b-b8a0-1efcaa504902-hosts-file\") pod \"node-resolver-2jks9\" (UID: \"21343843-2fbe-463b-b8a0-1efcaa504902\") " pod="openshift-dns/node-resolver-2jks9" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441009 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-cni-netd\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441035 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-os-release\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441061 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nghs7\" (UniqueName: \"kubernetes.io/projected/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-kube-api-access-nghs7\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 
11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441088 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w7bc\" (UniqueName: \"kubernetes.io/projected/e9a53215-1d0d-47de-92c4-cea3209fe4fa-kube-api-access-2w7bc\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441113 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4vbd\" (UniqueName: \"kubernetes.io/projected/40a065b9-60d2-47fd-8d74-78d53ae612a9-kube-api-access-v4vbd\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441138 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-run-netns\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441160 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-node-log\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441201 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8b0a26bc-e371-4829-9e6f-95e93b1633e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " 
pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441222 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-kubelet\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441250 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-cni-dir\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441273 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-cnibin\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441300 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-var-lib-kubelet\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.440937 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-cnibin\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441318 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-daemon-config\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441372 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40a065b9-60d2-47fd-8d74-78d53ae612a9-ovnkube-config\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441457 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-run-netns\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441520 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b0a26bc-e371-4829-9e6f-95e93b1633e7-os-release\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.441854 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-daemon-config\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.442092 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-multus-cni-dir\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.442161 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-cnibin\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.442210 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-host-var-lib-kubelet\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.442335 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8b0a26bc-e371-4829-9e6f-95e93b1633e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.454133 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9a53215-1d0d-47de-92c4-cea3209fe4fa-proxy-tls\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.458319 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfldv\" (UniqueName: \"kubernetes.io/projected/8b0a26bc-e371-4829-9e6f-95e93b1633e7-kube-api-access-vfldv\") pod 
\"multus-additional-cni-plugins-mh2cx\" (UID: \"8b0a26bc-e371-4829-9e6f-95e93b1633e7\") " pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.463236 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpsw5\" (UniqueName: \"kubernetes.io/projected/21343843-2fbe-463b-b8a0-1efcaa504902-kube-api-access-lpsw5\") pod \"node-resolver-2jks9\" (UID: \"21343843-2fbe-463b-b8a0-1efcaa504902\") " pod="openshift-dns/node-resolver-2jks9" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.467658 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nghs7\" (UniqueName: \"kubernetes.io/projected/b7c16583-1b5a-4cef-9163-eb0b3e1440c1-kube-api-access-nghs7\") pod \"multus-627t7\" (UID: \"b7c16583-1b5a-4cef-9163-eb0b3e1440c1\") " pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.469113 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w7bc\" (UniqueName: \"kubernetes.io/projected/e9a53215-1d0d-47de-92c4-cea3209fe4fa-kube-api-access-2w7bc\") pod \"machine-config-daemon-8l67n\" (UID: \"e9a53215-1d0d-47de-92c4-cea3209fe4fa\") " pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.498009 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.498047 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.498056 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.498070 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.498079 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:11Z","lastTransitionTime":"2026-03-18T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.509709 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-p6xnj"] Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.510337 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.511913 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.512599 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.513609 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.513960 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.541997 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 
11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542045 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-cni-bin\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542069 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40a065b9-60d2-47fd-8d74-78d53ae612a9-ovn-node-metrics-cert\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542094 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-run-openvswitch\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542106 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542137 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-etc-openvswitch\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542166 
4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-run-ovn\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542195 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542220 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40a065b9-60d2-47fd-8d74-78d53ae612a9-ovnkube-script-lib\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542252 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-run-netns\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542272 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-log-socket\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542295 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-slash\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542316 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-var-lib-openvswitch\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542338 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-run-systemd\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542370 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-systemd-units\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542392 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-cni-netd\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542416 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4vbd\" 
(UniqueName: \"kubernetes.io/projected/40a065b9-60d2-47fd-8d74-78d53ae612a9-kube-api-access-v4vbd\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542435 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-node-log\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542473 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-kubelet\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542496 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40a065b9-60d2-47fd-8d74-78d53ae612a9-ovnkube-config\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542528 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40a065b9-60d2-47fd-8d74-78d53ae612a9-env-overrides\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542633 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-cni-bin\") pod 
\"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542720 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-slash\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542747 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-run-openvswitch\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.542985 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-etc-openvswitch\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543010 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-run-ovn\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543034 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543076 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40a065b9-60d2-47fd-8d74-78d53ae612a9-env-overrides\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543142 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-var-lib-openvswitch\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543180 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-run-systemd\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543211 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-systemd-units\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543238 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-cni-netd\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543515 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-node-log\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543541 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40a065b9-60d2-47fd-8d74-78d53ae612a9-ovnkube-script-lib\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543584 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-run-netns\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543637 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-host-kubelet\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.543664 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40a065b9-60d2-47fd-8d74-78d53ae612a9-log-socket\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.544001 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/40a065b9-60d2-47fd-8d74-78d53ae612a9-ovnkube-config\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.548131 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40a065b9-60d2-47fd-8d74-78d53ae612a9-ovn-node-metrics-cert\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.560460 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4vbd\" (UniqueName: \"kubernetes.io/projected/40a065b9-60d2-47fd-8d74-78d53ae612a9-kube-api-access-v4vbd\") pod \"ovnkube-node-ts942\" (UID: \"40a065b9-60d2-47fd-8d74-78d53ae612a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.571133 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2jks9" Mar 18 11:58:11 crc kubenswrapper[4965]: W0318 11:58:11.582465 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21343843_2fbe_463b_b8a0_1efcaa504902.slice/crio-4dc7020f2651bdcd47c730cfc2060d1b7b370d50e25ba974ec1a3772a7ba8ad8 WatchSource:0}: Error finding container 4dc7020f2651bdcd47c730cfc2060d1b7b370d50e25ba974ec1a3772a7ba8ad8: Status 404 returned error can't find the container with id 4dc7020f2651bdcd47c730cfc2060d1b7b370d50e25ba974ec1a3772a7ba8ad8 Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.600811 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.609035 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.609065 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.609075 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.609088 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.609097 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:11Z","lastTransitionTime":"2026-03-18T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.614911 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" Mar 18 11:58:11 crc kubenswrapper[4965]: W0318 11:58:11.617925 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9a53215_1d0d_47de_92c4_cea3209fe4fa.slice/crio-7251665427b59df225e80844f750fd0396238435278624896b03b489939c0daf WatchSource:0}: Error finding container 7251665427b59df225e80844f750fd0396238435278624896b03b489939c0daf: Status 404 returned error can't find the container with id 7251665427b59df225e80844f750fd0396238435278624896b03b489939c0daf Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.624715 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-627t7" Mar 18 11:58:11 crc kubenswrapper[4965]: W0318 11:58:11.625535 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b0a26bc_e371_4829_9e6f_95e93b1633e7.slice/crio-163a74ae5d438d8d03b23b32bcd0657dcfe0beb54174b17afcfbb6849b8600b6 WatchSource:0}: Error finding container 163a74ae5d438d8d03b23b32bcd0657dcfe0beb54174b17afcfbb6849b8600b6: Status 404 returned error can't find the container with id 163a74ae5d438d8d03b23b32bcd0657dcfe0beb54174b17afcfbb6849b8600b6 Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.640546 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.642901 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a7a17d5-a852-4acf-b2b0-7f5beae9e681-host\") pod \"node-ca-p6xnj\" (UID: \"1a7a17d5-a852-4acf-b2b0-7f5beae9e681\") " pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.642929 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a7a17d5-a852-4acf-b2b0-7f5beae9e681-serviceca\") pod \"node-ca-p6xnj\" (UID: \"1a7a17d5-a852-4acf-b2b0-7f5beae9e681\") " pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.642955 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngwl5\" (UniqueName: \"kubernetes.io/projected/1a7a17d5-a852-4acf-b2b0-7f5beae9e681-kube-api-access-ngwl5\") pod \"node-ca-p6xnj\" (UID: \"1a7a17d5-a852-4acf-b2b0-7f5beae9e681\") " pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: W0318 11:58:11.670172 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40a065b9_60d2_47fd_8d74_78d53ae612a9.slice/crio-5117e214f55f5f454bb0378d8da3a60e9a026c40894ed02565a14686e8a7cb73 WatchSource:0}: Error finding container 5117e214f55f5f454bb0378d8da3a60e9a026c40894ed02565a14686e8a7cb73: Status 404 returned error can't find the container with id 5117e214f55f5f454bb0378d8da3a60e9a026c40894ed02565a14686e8a7cb73 Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.677757 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj"] Mar 18 11:58:11 crc 
kubenswrapper[4965]: I0318 11:58:11.679708 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.681553 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.682295 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.693828 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9jx9z"] Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.694518 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:11 crc kubenswrapper[4965]: E0318 11:58:11.694588 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9jx9z" podUID="1b676c03-201d-403c-8082-84451760c106" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.717297 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.717337 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.717346 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.717360 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.717371 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:11Z","lastTransitionTime":"2026-03-18T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743524 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/494789a7-a004-4ae6-81f2-22630e1d0ae4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743558 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/494789a7-a004-4ae6-81f2-22630e1d0ae4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743600 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdh9k\" (UniqueName: \"kubernetes.io/projected/494789a7-a004-4ae6-81f2-22630e1d0ae4-kube-api-access-hdh9k\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743628 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a7a17d5-a852-4acf-b2b0-7f5beae9e681-host\") pod \"node-ca-p6xnj\" (UID: \"1a7a17d5-a852-4acf-b2b0-7f5beae9e681\") " pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743643 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a7a17d5-a852-4acf-b2b0-7f5beae9e681-serviceca\") pod 
\"node-ca-p6xnj\" (UID: \"1a7a17d5-a852-4acf-b2b0-7f5beae9e681\") " pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743663 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngwl5\" (UniqueName: \"kubernetes.io/projected/1a7a17d5-a852-4acf-b2b0-7f5beae9e681-kube-api-access-ngwl5\") pod \"node-ca-p6xnj\" (UID: \"1a7a17d5-a852-4acf-b2b0-7f5beae9e681\") " pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743713 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743757 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbt9g\" (UniqueName: \"kubernetes.io/projected/1b676c03-201d-403c-8082-84451760c106-kube-api-access-jbt9g\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743787 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/494789a7-a004-4ae6-81f2-22630e1d0ae4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.743906 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/1a7a17d5-a852-4acf-b2b0-7f5beae9e681-host\") pod \"node-ca-p6xnj\" (UID: \"1a7a17d5-a852-4acf-b2b0-7f5beae9e681\") " pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.744838 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a7a17d5-a852-4acf-b2b0-7f5beae9e681-serviceca\") pod \"node-ca-p6xnj\" (UID: \"1a7a17d5-a852-4acf-b2b0-7f5beae9e681\") " pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.767069 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngwl5\" (UniqueName: \"kubernetes.io/projected/1a7a17d5-a852-4acf-b2b0-7f5beae9e681-kube-api-access-ngwl5\") pod \"node-ca-p6xnj\" (UID: \"1a7a17d5-a852-4acf-b2b0-7f5beae9e681\") " pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.819331 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.819463 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.819534 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.819600 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.819676 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:11Z","lastTransitionTime":"2026-03-18T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.823899 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p6xnj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.844226 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.844271 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbt9g\" (UniqueName: \"kubernetes.io/projected/1b676c03-201d-403c-8082-84451760c106-kube-api-access-jbt9g\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.844296 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/494789a7-a004-4ae6-81f2-22630e1d0ae4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.844331 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/494789a7-a004-4ae6-81f2-22630e1d0ae4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc 
kubenswrapper[4965]: I0318 11:58:11.844350 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/494789a7-a004-4ae6-81f2-22630e1d0ae4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.844377 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdh9k\" (UniqueName: \"kubernetes.io/projected/494789a7-a004-4ae6-81f2-22630e1d0ae4-kube-api-access-hdh9k\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: E0318 11:58:11.844450 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:11 crc kubenswrapper[4965]: E0318 11:58:11.844547 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs podName:1b676c03-201d-403c-8082-84451760c106 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:12.344527993 +0000 UTC m=+97.330715472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs") pod "network-metrics-daemon-9jx9z" (UID: "1b676c03-201d-403c-8082-84451760c106") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.845583 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/494789a7-a004-4ae6-81f2-22630e1d0ae4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.845764 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/494789a7-a004-4ae6-81f2-22630e1d0ae4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: W0318 11:58:11.846889 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a7a17d5_a852_4acf_b2b0_7f5beae9e681.slice/crio-16fb588d4bc50d80aff59fa94ad4945417983f382f17a8e9c7f30bf7ec62c27c WatchSource:0}: Error finding container 16fb588d4bc50d80aff59fa94ad4945417983f382f17a8e9c7f30bf7ec62c27c: Status 404 returned error can't find the container with id 16fb588d4bc50d80aff59fa94ad4945417983f382f17a8e9c7f30bf7ec62c27c Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.849251 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/494789a7-a004-4ae6-81f2-22630e1d0ae4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: 
\"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.864116 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdh9k\" (UniqueName: \"kubernetes.io/projected/494789a7-a004-4ae6-81f2-22630e1d0ae4-kube-api-access-hdh9k\") pod \"ovnkube-control-plane-749d76644c-c5tsj\" (UID: \"494789a7-a004-4ae6-81f2-22630e1d0ae4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.864383 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbt9g\" (UniqueName: \"kubernetes.io/projected/1b676c03-201d-403c-8082-84451760c106-kube-api-access-jbt9g\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.922408 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.922460 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.922475 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.922492 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:11 crc kubenswrapper[4965]: I0318 11:58:11.922504 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:11Z","lastTransitionTime":"2026-03-18T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.008358 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.024123 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.024153 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.024165 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.024194 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.024206 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.127640 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.127690 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.127699 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.127713 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.127722 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.230125 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.230162 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.230173 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.230188 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.230198 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.332757 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.332803 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.332812 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.332827 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.332838 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.349784 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:12 crc kubenswrapper[4965]: E0318 11:58:12.349946 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:12 crc kubenswrapper[4965]: E0318 11:58:12.350003 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs podName:1b676c03-201d-403c-8082-84451760c106 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:13.34998658 +0000 UTC m=+98.336174169 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs") pod "network-metrics-daemon-9jx9z" (UID: "1b676c03-201d-403c-8082-84451760c106") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.414455 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" event={"ID":"494789a7-a004-4ae6-81f2-22630e1d0ae4","Type":"ContainerStarted","Data":"ef52ac5eea572e0e15aebc9711d065769356fbd3eb3b65c5d23370d2071d1365"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.414511 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" event={"ID":"494789a7-a004-4ae6-81f2-22630e1d0ae4","Type":"ContainerStarted","Data":"9dfd8ec5fb724e211ee73116379b7191708640184db89c594d9c48de79334a5f"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.414525 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" event={"ID":"494789a7-a004-4ae6-81f2-22630e1d0ae4","Type":"ContainerStarted","Data":"464ea203bd07df8f27564b9bb642d298380afe5083ad77434acf7b384e4d6910"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.415549 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2jks9" event={"ID":"21343843-2fbe-463b-b8a0-1efcaa504902","Type":"ContainerStarted","Data":"84c67e726e08ab826526e840d0b0bb36c932bc39c015f72911ec8f0bbc8573ea"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.415579 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2jks9" event={"ID":"21343843-2fbe-463b-b8a0-1efcaa504902","Type":"ContainerStarted","Data":"4dc7020f2651bdcd47c730cfc2060d1b7b370d50e25ba974ec1a3772a7ba8ad8"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.417316 
4965 generic.go:334] "Generic (PLEG): container finished" podID="40a065b9-60d2-47fd-8d74-78d53ae612a9" containerID="5d875da27076e4fedd4163ae2f2a0a96dee16b57225770350eef9122c75efee6" exitCode=0 Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.417387 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerDied","Data":"5d875da27076e4fedd4163ae2f2a0a96dee16b57225770350eef9122c75efee6"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.417446 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerStarted","Data":"5117e214f55f5f454bb0378d8da3a60e9a026c40894ed02565a14686e8a7cb73"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.420354 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" event={"ID":"e9a53215-1d0d-47de-92c4-cea3209fe4fa","Type":"ContainerStarted","Data":"07ccfd2ee711357131904692bdda1cad818f362cf556bbe446d974c43dd93e2c"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.420406 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" event={"ID":"e9a53215-1d0d-47de-92c4-cea3209fe4fa","Type":"ContainerStarted","Data":"ee65076d3fbf5f0628fdea60bce51ce496c09b690ec6c4b4248352a70b8fb132"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.420418 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" event={"ID":"e9a53215-1d0d-47de-92c4-cea3209fe4fa","Type":"ContainerStarted","Data":"7251665427b59df225e80844f750fd0396238435278624896b03b489939c0daf"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.422147 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-p6xnj" event={"ID":"1a7a17d5-a852-4acf-b2b0-7f5beae9e681","Type":"ContainerStarted","Data":"9b7ba5a5ac37a871d8c447f065f0f5219409666b61488e5facabb1c867ad53a5"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.422194 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p6xnj" event={"ID":"1a7a17d5-a852-4acf-b2b0-7f5beae9e681","Type":"ContainerStarted","Data":"16fb588d4bc50d80aff59fa94ad4945417983f382f17a8e9c7f30bf7ec62c27c"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.424099 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-627t7" event={"ID":"b7c16583-1b5a-4cef-9163-eb0b3e1440c1","Type":"ContainerStarted","Data":"a8a2ed43153cd8d90909b44faaea4f752b1d5e0fff1c751e6dc79dd809133a49"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.424237 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-627t7" event={"ID":"b7c16583-1b5a-4cef-9163-eb0b3e1440c1","Type":"ContainerStarted","Data":"1db950fde5b27697f6aaa1afd44cec5aa958d4f51b9c5f99e0e4fce75176a38b"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.426783 4965 generic.go:334] "Generic (PLEG): container finished" podID="8b0a26bc-e371-4829-9e6f-95e93b1633e7" containerID="f20e153d0e7d406bc7881c723e36a56868fbb817c2f817d494aaf05523fe0257" exitCode=0 Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.426853 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" event={"ID":"8b0a26bc-e371-4829-9e6f-95e93b1633e7","Type":"ContainerDied","Data":"f20e153d0e7d406bc7881c723e36a56868fbb817c2f817d494aaf05523fe0257"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.426906 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" 
event={"ID":"8b0a26bc-e371-4829-9e6f-95e93b1633e7","Type":"ContainerStarted","Data":"163a74ae5d438d8d03b23b32bcd0657dcfe0beb54174b17afcfbb6849b8600b6"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.440168 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.440338 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.440419 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.440485 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.440542 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.449132 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2jks9" podStartSLOduration=36.449112351 podStartE2EDuration="36.449112351s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:12.434508199 +0000 UTC m=+97.420695698" watchObservedRunningTime="2026-03-18 11:58:12.449112351 +0000 UTC m=+97.435299830" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.449317 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" podStartSLOduration=36.449313387 podStartE2EDuration="36.449313387s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:12.44869971 +0000 UTC m=+97.434887189" watchObservedRunningTime="2026-03-18 11:58:12.449313387 +0000 UTC m=+97.435500856" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.499892 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-627t7" podStartSLOduration=36.49986767 podStartE2EDuration="36.49986767s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:12.472580068 +0000 UTC m=+97.458767557" watchObservedRunningTime="2026-03-18 11:58:12.49986767 +0000 UTC m=+97.486055149" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.543128 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.543171 4965 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.543181 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.543198 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.543212 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.557321 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-p6xnj" podStartSLOduration=36.557296541 podStartE2EDuration="36.557296541s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:12.555626895 +0000 UTC m=+97.541814394" watchObservedRunningTime="2026-03-18 11:58:12.557296541 +0000 UTC m=+97.543484020" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.646531 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.646711 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.646806 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.646890 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.646965 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.750746 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.751172 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.751186 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.751200 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.751209 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.854148 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.854196 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.854211 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.854228 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.854238 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.957405 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.957448 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.957458 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.957473 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:12 crc kubenswrapper[4965]: I0318 11:58:12.957486 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:12Z","lastTransitionTime":"2026-03-18T11:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.020395 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.020433 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.020482 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:13 crc kubenswrapper[4965]: E0318 11:58:13.020545 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9jx9z" podUID="1b676c03-201d-403c-8082-84451760c106" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.020593 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:13 crc kubenswrapper[4965]: E0318 11:58:13.020717 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 11:58:13 crc kubenswrapper[4965]: E0318 11:58:13.020785 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 11:58:13 crc kubenswrapper[4965]: E0318 11:58:13.020825 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.060157 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.060202 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.060213 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.060233 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.060247 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:13Z","lastTransitionTime":"2026-03-18T11:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.162993 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.163422 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.163433 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.163448 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.163458 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:13Z","lastTransitionTime":"2026-03-18T11:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.265581 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.265638 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.265655 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.265686 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.265698 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:13Z","lastTransitionTime":"2026-03-18T11:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.360554 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:13 crc kubenswrapper[4965]: E0318 11:58:13.360750 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:13 crc kubenswrapper[4965]: E0318 11:58:13.360871 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs podName:1b676c03-201d-403c-8082-84451760c106 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:15.360843172 +0000 UTC m=+100.347030691 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs") pod "network-metrics-daemon-9jx9z" (UID: "1b676c03-201d-403c-8082-84451760c106") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.369343 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.369399 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.369418 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.369440 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.369461 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:13Z","lastTransitionTime":"2026-03-18T11:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.435889 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerStarted","Data":"0d1653212a5316077db1313fce440e3c65d9fe4bc4972b3d52123c0473ec3176"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.435968 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerStarted","Data":"cc419a5d024e12b9edba1bc9aa26d529827160520006f3b98699b65b3f8c9872"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.435998 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerStarted","Data":"a243977785100c3d1eb4e6328a9d6d833235898a0f208ed70bd9133ae7d8e142"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.436023 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerStarted","Data":"877f59b10495fe171bc15c82c42e1f987fde76133fddfa345bf07b6aaa4ca8ef"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.442461 4965 generic.go:334] "Generic (PLEG): container finished" podID="8b0a26bc-e371-4829-9e6f-95e93b1633e7" containerID="4bd88846ed07be0bf0af0091ae025875c2fdef92db7236009cc857017fafef1b" exitCode=0 Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.443343 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" event={"ID":"8b0a26bc-e371-4829-9e6f-95e93b1633e7","Type":"ContainerDied","Data":"4bd88846ed07be0bf0af0091ae025875c2fdef92db7236009cc857017fafef1b"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.472495 4965 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.472589 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.472639 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.472689 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.472707 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:13Z","lastTransitionTime":"2026-03-18T11:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.486965 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c5tsj" podStartSLOduration=36.486946216 podStartE2EDuration="36.486946216s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:13.486473483 +0000 UTC m=+98.472660962" watchObservedRunningTime="2026-03-18 11:58:13.486946216 +0000 UTC m=+98.473133695" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.576437 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.576483 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.576492 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.576504 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.576512 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:13Z","lastTransitionTime":"2026-03-18T11:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.683330 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.683789 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.683801 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.683815 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.683825 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:13Z","lastTransitionTime":"2026-03-18T11:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.786585 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.786628 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.786639 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.786687 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.786711 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:13Z","lastTransitionTime":"2026-03-18T11:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.788550 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.788587 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.788597 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.788611 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.788621 4965 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T11:58:13Z","lastTransitionTime":"2026-03-18T11:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.827012 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g"] Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.827468 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.829265 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.829372 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.829847 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.830352 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.968334 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.968384 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.968409 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.968544 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:13 crc kubenswrapper[4965]: I0318 11:58:13.968622 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.018115 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.028978 4965 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.069646 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.069776 
4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.069827 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.069858 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.069976 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.070025 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: 
\"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.070040 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.071835 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.077304 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.089830 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c88fbb0c-f9cd-40c7-be17-9f6e7238bcca-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2hm2g\" (UID: \"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.142411 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" Mar 18 11:58:14 crc kubenswrapper[4965]: W0318 11:58:14.162286 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc88fbb0c_f9cd_40c7_be17_9f6e7238bcca.slice/crio-714647ca0bb6632c21e3c8f1417c0e8097c479ce08df69fae5605e80a001e8b8 WatchSource:0}: Error finding container 714647ca0bb6632c21e3c8f1417c0e8097c479ce08df69fae5605e80a001e8b8: Status 404 returned error can't find the container with id 714647ca0bb6632c21e3c8f1417c0e8097c479ce08df69fae5605e80a001e8b8 Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.452133 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerStarted","Data":"5ea39919e33f667f5d711d0b39471d905cbad9315351504d7558e7c998feee86"} Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.452716 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerStarted","Data":"382b2e2e35758bd462058b2c93265427615be3bfda7f431b8b44dcbf62bcb809"} Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.453853 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" event={"ID":"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca","Type":"ContainerStarted","Data":"01fcb05871be68c98a1addf5a8bfe56bc0db22270e96b10a6262de2a689beabd"} Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.453914 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" event={"ID":"c88fbb0c-f9cd-40c7-be17-9f6e7238bcca","Type":"ContainerStarted","Data":"714647ca0bb6632c21e3c8f1417c0e8097c479ce08df69fae5605e80a001e8b8"} Mar 18 11:58:14 crc 
kubenswrapper[4965]: I0318 11:58:14.458113 4965 generic.go:334] "Generic (PLEG): container finished" podID="8b0a26bc-e371-4829-9e6f-95e93b1633e7" containerID="74ee469569b5864f7b0724aa6276e70843ab4c5dd854787c16381d4f1eb95716" exitCode=0 Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.458185 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" event={"ID":"8b0a26bc-e371-4829-9e6f-95e93b1633e7","Type":"ContainerDied","Data":"74ee469569b5864f7b0724aa6276e70843ab4c5dd854787c16381d4f1eb95716"} Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.471364 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hm2g" podStartSLOduration=38.4713426 podStartE2EDuration="38.4713426s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:14.470732253 +0000 UTC m=+99.456919742" watchObservedRunningTime="2026-03-18 11:58:14.4713426 +0000 UTC m=+99.457530079" Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.676422 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.676514 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:22.676487542 +0000 UTC m=+107.662675021 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.777426 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.777743 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.777784 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.777803 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.778050 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.778176 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:22.778122422 +0000 UTC m=+107.764309911 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.778255 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.778512 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:14 crc kubenswrapper[4965]: I0318 11:58:14.778567 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.778608 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:22.778585015 +0000 UTC m=+107.764772564 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.778733 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.778797 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:22.778785591 +0000 UTC m=+107.764973080 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.778905 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.778923 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.778939 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:14 crc kubenswrapper[4965]: E0318 11:58:14.778982 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:22.778970786 +0000 UTC m=+107.765158375 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:15 crc kubenswrapper[4965]: I0318 11:58:15.020797 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:15 crc kubenswrapper[4965]: I0318 11:58:15.020843 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:15 crc kubenswrapper[4965]: I0318 11:58:15.020884 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:15 crc kubenswrapper[4965]: E0318 11:58:15.022354 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9jx9z" podUID="1b676c03-201d-403c-8082-84451760c106" Mar 18 11:58:15 crc kubenswrapper[4965]: E0318 11:58:15.022217 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 11:58:15 crc kubenswrapper[4965]: E0318 11:58:15.022074 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 11:58:15 crc kubenswrapper[4965]: I0318 11:58:15.022928 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:15 crc kubenswrapper[4965]: E0318 11:58:15.023124 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 11:58:15 crc kubenswrapper[4965]: I0318 11:58:15.384218 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:15 crc kubenswrapper[4965]: E0318 11:58:15.384480 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:15 crc kubenswrapper[4965]: E0318 11:58:15.384570 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs podName:1b676c03-201d-403c-8082-84451760c106 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:19.384540352 +0000 UTC m=+104.370727871 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs") pod "network-metrics-daemon-9jx9z" (UID: "1b676c03-201d-403c-8082-84451760c106") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:15 crc kubenswrapper[4965]: I0318 11:58:15.466191 4965 generic.go:334] "Generic (PLEG): container finished" podID="8b0a26bc-e371-4829-9e6f-95e93b1633e7" containerID="bdf81eb2cb4243b80f418a31e50599cb40162920c4b3fb7ed0c9fdba7befa0b3" exitCode=0 Mar 18 11:58:15 crc kubenswrapper[4965]: I0318 11:58:15.466266 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" event={"ID":"8b0a26bc-e371-4829-9e6f-95e93b1633e7","Type":"ContainerDied","Data":"bdf81eb2cb4243b80f418a31e50599cb40162920c4b3fb7ed0c9fdba7befa0b3"} Mar 18 11:58:16 crc kubenswrapper[4965]: I0318 11:58:16.473439 4965 generic.go:334] "Generic (PLEG): container finished" podID="8b0a26bc-e371-4829-9e6f-95e93b1633e7" containerID="6f78e7f0335f13e4f1351219cbe52d85980669976b0b56813b6ba17bd8d2bbe7" exitCode=0 Mar 18 11:58:16 crc kubenswrapper[4965]: I0318 11:58:16.473562 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" event={"ID":"8b0a26bc-e371-4829-9e6f-95e93b1633e7","Type":"ContainerDied","Data":"6f78e7f0335f13e4f1351219cbe52d85980669976b0b56813b6ba17bd8d2bbe7"} Mar 18 11:58:16 crc kubenswrapper[4965]: I0318 11:58:16.483067 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerStarted","Data":"9ff520af9b2770e9d37de3a812723b44a063f17b2fcdcd8e5f085fe5c4f578c9"} Mar 18 11:58:17 crc kubenswrapper[4965]: I0318 11:58:17.020697 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:17 crc kubenswrapper[4965]: I0318 11:58:17.020781 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:17 crc kubenswrapper[4965]: I0318 11:58:17.020827 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:17 crc kubenswrapper[4965]: I0318 11:58:17.020781 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:17 crc kubenswrapper[4965]: E0318 11:58:17.021020 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9jx9z" podUID="1b676c03-201d-403c-8082-84451760c106" Mar 18 11:58:17 crc kubenswrapper[4965]: E0318 11:58:17.021194 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 11:58:17 crc kubenswrapper[4965]: E0318 11:58:17.021333 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 11:58:17 crc kubenswrapper[4965]: E0318 11:58:17.021452 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 11:58:17 crc kubenswrapper[4965]: I0318 11:58:17.492280 4965 generic.go:334] "Generic (PLEG): container finished" podID="8b0a26bc-e371-4829-9e6f-95e93b1633e7" containerID="16ea26c951a0fa24b1dbf6e2783f467226a7c7bc2a832cc767c98171b1172758" exitCode=0 Mar 18 11:58:17 crc kubenswrapper[4965]: I0318 11:58:17.492330 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" event={"ID":"8b0a26bc-e371-4829-9e6f-95e93b1633e7","Type":"ContainerDied","Data":"16ea26c951a0fa24b1dbf6e2783f467226a7c7bc2a832cc767c98171b1172758"} Mar 18 11:58:18 crc kubenswrapper[4965]: I0318 11:58:18.501229 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" event={"ID":"8b0a26bc-e371-4829-9e6f-95e93b1633e7","Type":"ContainerStarted","Data":"3d7f8ecb589ae3a579de4907511004ac1121377c44226c42d4c93cdd608755bc"} Mar 18 11:58:18 crc kubenswrapper[4965]: I0318 11:58:18.504918 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" event={"ID":"40a065b9-60d2-47fd-8d74-78d53ae612a9","Type":"ContainerStarted","Data":"2c9c3140258b64c5eb4ef18a73d73b9bb4a0c4b1e0e5289e9f3b2f43fbcc2124"} Mar 18 11:58:18 crc kubenswrapper[4965]: I0318 11:58:18.513068 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:18 crc kubenswrapper[4965]: I0318 11:58:18.513121 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:18 crc kubenswrapper[4965]: I0318 11:58:18.514969 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:18 crc kubenswrapper[4965]: I0318 11:58:18.548023 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:18 crc kubenswrapper[4965]: I0318 11:58:18.548979 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mh2cx" podStartSLOduration=42.548959992 podStartE2EDuration="42.548959992s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:18.548768726 +0000 UTC m=+103.534956225" watchObservedRunningTime="2026-03-18 11:58:18.548959992 +0000 UTC m=+103.535147481" Mar 18 11:58:18 crc kubenswrapper[4965]: I0318 11:58:18.553052 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:18 crc kubenswrapper[4965]: I0318 11:58:18.595376 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" podStartSLOduration=42.59535511 podStartE2EDuration="42.59535511s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:18.59535685 +0000 UTC m=+103.581544349" watchObservedRunningTime="2026-03-18 11:58:18.59535511 +0000 UTC m=+103.581542589" Mar 18 11:58:19 crc kubenswrapper[4965]: I0318 
11:58:19.020753 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:19 crc kubenswrapper[4965]: I0318 11:58:19.021069 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:19 crc kubenswrapper[4965]: I0318 11:58:19.021035 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:19 crc kubenswrapper[4965]: E0318 11:58:19.021186 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9jx9z" podUID="1b676c03-201d-403c-8082-84451760c106" Mar 18 11:58:19 crc kubenswrapper[4965]: E0318 11:58:19.021238 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 11:58:19 crc kubenswrapper[4965]: I0318 11:58:19.021271 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:19 crc kubenswrapper[4965]: E0318 11:58:19.021328 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 11:58:19 crc kubenswrapper[4965]: E0318 11:58:19.021381 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 11:58:19 crc kubenswrapper[4965]: I0318 11:58:19.465449 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:19 crc kubenswrapper[4965]: E0318 11:58:19.465730 4965 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:19 crc kubenswrapper[4965]: E0318 11:58:19.465821 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs podName:1b676c03-201d-403c-8082-84451760c106 nodeName:}" failed. 
No retries permitted until 2026-03-18 11:58:27.465795674 +0000 UTC m=+112.451983183 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs") pod "network-metrics-daemon-9jx9z" (UID: "1b676c03-201d-403c-8082-84451760c106") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 11:58:20 crc kubenswrapper[4965]: I0318 11:58:20.294871 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9jx9z"] Mar 18 11:58:20 crc kubenswrapper[4965]: I0318 11:58:20.294997 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:20 crc kubenswrapper[4965]: E0318 11:58:20.295133 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9jx9z" podUID="1b676c03-201d-403c-8082-84451760c106" Mar 18 11:58:21 crc kubenswrapper[4965]: I0318 11:58:21.019991 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:21 crc kubenswrapper[4965]: E0318 11:58:21.020116 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 11:58:21 crc kubenswrapper[4965]: I0318 11:58:21.020521 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:21 crc kubenswrapper[4965]: E0318 11:58:21.020570 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 11:58:21 crc kubenswrapper[4965]: I0318 11:58:21.020622 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:21 crc kubenswrapper[4965]: E0318 11:58:21.020699 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.020157 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.021221 4965 scope.go:117] "RemoveContainer" containerID="733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76" Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.020743 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9jx9z" podUID="1b676c03-201d-403c-8082-84451760c106" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.524045 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.527503 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c1605c2a78731623f65967e7b4af0bc71ba4b55c952a45bd2b093a59ac425a8"} Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.528114 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.696051 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.696435 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:38.696417198 +0000 UTC m=+123.682604677 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.798207 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.798286 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.798397 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.798463 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.798594 4965 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.798747 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:38.798701686 +0000 UTC m=+123.784889235 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799406 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799443 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799465 4965 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799530 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:38.799506098 +0000 UTC m=+123.785693617 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799635 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799697 4965 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799717 4965 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799778 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:38.799760255 +0000 UTC m=+123.785947764 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799875 4965 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:22 crc kubenswrapper[4965]: E0318 11:58:22.799929 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:38.799912019 +0000 UTC m=+123.786099538 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.853111 4965 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.853622 4965 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.910547 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.910521687 podStartE2EDuration="14.910521687s" podCreationTimestamp="2026-03-18 11:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:22.550324222 +0000 UTC m=+107.536511721" watchObservedRunningTime="2026-03-18 11:58:22.910521687 +0000 UTC m=+107.896709186" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.913825 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ntz44"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.915051 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-pj8km"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.915561 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qtnkp"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.922291 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qtnkp" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.922815 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.922876 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.933862 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-trusted-ca\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.933902 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm4nn\" (UniqueName: \"kubernetes.io/projected/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-kube-api-access-xm4nn\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.933925 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-config\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.933952 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92wjp\" (UniqueName: 
\"kubernetes.io/projected/36b1bb7c-dba1-491c-84ba-820adbe96f3a-kube-api-access-92wjp\") pod \"dns-operator-744455d44c-ntz44\" (UID: \"36b1bb7c-dba1-491c-84ba-820adbe96f3a\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.933997 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-oauth-serving-cert\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934014 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-console-config\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934056 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzv8\" (UniqueName: \"kubernetes.io/projected/bf0cad60-78bb-4325-aaaa-ee2636410fcb-kube-api-access-gfzv8\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934070 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-serving-cert\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934137 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-service-ca\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934152 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf0cad60-78bb-4325-aaaa-ee2636410fcb-console-oauth-config\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934176 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/36b1bb7c-dba1-491c-84ba-820adbe96f3a-metrics-tls\") pod \"dns-operator-744455d44c-ntz44\" (UID: \"36b1bb7c-dba1-491c-84ba-820adbe96f3a\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934193 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf0cad60-78bb-4325-aaaa-ee2636410fcb-console-serving-cert\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934206 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-trusted-ca-bundle\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:22 crc 
kubenswrapper[4965]: I0318 11:58:22.934328 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934734 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6w7zg"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934984 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.934998 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-q9zxt"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.935358 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.935446 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-q9zxt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.936786 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.936916 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fs56m"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.937124 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.937610 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.938072 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.938483 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.938102 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.938148 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.938183 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.940287 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sw5p7"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.938216 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.938304 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.940777 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.940696 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.938337 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.941428 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.938385 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.941529 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mmzsn"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.943231 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k68bp"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.943676 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.943889 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.945594 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.945986 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.946333 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.946454 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.946748 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.948521 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.948715 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.951070 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.952234 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.952504 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.952624 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.952768 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.952525 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.952990 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.952572 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.953233 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.952945 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.953192 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.953420 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.953879 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.954058 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.954099 4965 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.955983 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.956312 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.956817 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.956828 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.956991 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.957013 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.957527 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.958750 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zptvt"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.959458 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.960383 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.960420 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.960450 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.960798 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.960951 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961196 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961583 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961608 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961825 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.962000 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sk66r"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961857 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961859 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961881 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961918 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961926 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961943 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.961990 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.962029 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.962079 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.963280 4965 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.963999 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.964086 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.964328 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.964358 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.964441 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.964521 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.964673 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.964738 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.964918 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.979958 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.980975 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.981246 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.981484 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.981715 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.982070 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.982437 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.983149 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.983409 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.983565 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.984195 4965 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"] Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.984448 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.984643 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.984816 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.984998 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.993158 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.995238 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.995439 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.995518 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.995579 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.995688 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 
11:58:22.995708 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.996012 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.996059 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.996195 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.997302 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr"]
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.997919 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt"]
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.999035 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v"]
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.998048 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.999319 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm"]
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.999599 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt"
Mar 18 11:58:22 crc kubenswrapper[4965]: I0318 11:58:22.998437 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.000026 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.000043 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.002746 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.002871 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.002891 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.003156 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.003627 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.003792 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.004089 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.004324 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.005485 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.005871 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.008212 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.008369 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.009452 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.009506 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.009682 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.009710 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.010496 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.011524 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.013183 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4sskq"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.013762 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4sskq"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.016314 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.016816 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmt4c"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.017155 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.017330 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.017650 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.018349 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.019200 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.019265 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.019364 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xhsv5"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.019632 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.019731 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.019954 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xhsv5"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.020736 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8pdbx"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.021320 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.024444 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fgg7z"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.026252 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.027313 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.029794 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.029809 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.030786 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.031811 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037475 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7a9e071f-3977-4a85-9b66-30c59efd7d3a-audit-policies\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037526 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-client-ca\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037556 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7011ee6-49c9-4b89-ae23-545bac0f68d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wszrt\" (UID: \"f7011ee6-49c9-4b89-ae23-545bac0f68d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037590 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-trusted-ca\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037617 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aaf7d7a-c623-4198-b68e-7efa895cb96f-config-volume\") pod \"collect-profiles-29563905-l6rnm\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037640 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a9e071f-3977-4a85-9b66-30c59efd7d3a-audit-dir\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037717 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-etcd-client\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037742 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d252be2-ace2-40c4-a4e3-3824d9bffff4-serving-cert\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037805 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm4nn\" (UniqueName: \"kubernetes.io/projected/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-kube-api-access-xm4nn\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037833 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-config\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037855 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-config\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037879 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6cfd1d38-a35c-47ab-962f-4403875e5a19-audit-dir\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037901 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e7d665e-a211-432c-8884-74096504ab5c-config\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037922 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037946 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-config\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037970 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.037998 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6cfd1d38-a35c-47ab-962f-4403875e5a19-encryption-config\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.038023 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98tf\" (UniqueName: \"kubernetes.io/projected/29015bb8-e604-425d-a88b-db3ec9e10096-kube-api-access-x98tf\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.038077 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9a21f3-0936-44a9-b1f0-e82345703f0a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nqqmj\" (UID: \"4a9a21f3-0936-44a9-b1f0-e82345703f0a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.038099 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-default-certificate\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.038133 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92wjp\" (UniqueName: \"kubernetes.io/projected/36b1bb7c-dba1-491c-84ba-820adbe96f3a-kube-api-access-92wjp\") pod \"dns-operator-744455d44c-ntz44\" (UID: \"36b1bb7c-dba1-491c-84ba-820adbe96f3a\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntz44"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.038155 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-oauth-serving-cert\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.038175 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-images\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.044375 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.045399 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.046178 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.048257 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.049219 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.049575 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.039300 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8d01f52f-de93-4fa8-80c2-88b21f4e6400-apiservice-cert\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056183 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-config\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056289 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnzl5\" (UniqueName: \"kubernetes.io/projected/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-kube-api-access-cnzl5\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056336 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-console-config\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056366 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9e7d665e-a211-432c-8884-74096504ab5c-images\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056395 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb6cf60-3071-4f09-b9aa-e5d05b03804e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wbdxj\" (UID: \"9bb6cf60-3071-4f09-b9aa-e5d05b03804e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056423 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9e071f-3977-4a85-9b66-30c59efd7d3a-serving-cert\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056453 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9e071f-3977-4a85-9b66-30c59efd7d3a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056504 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0c9543d-7fc2-45cb-a925-f3c030a3c8cf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-s2wpl\" (UID: \"b0c9543d-7fc2-45cb-a925-f3c030a3c8cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056548 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29015bb8-e604-425d-a88b-db3ec9e10096-serving-cert\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056583 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d252be2-ace2-40c4-a4e3-3824d9bffff4-config\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056633 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056683 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056716 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a9a21f3-0936-44a9-b1f0-e82345703f0a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nqqmj\" (UID: \"4a9a21f3-0936-44a9-b1f0-e82345703f0a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056745 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1dc0d2-30e8-46f0-a933-cd321c32590b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sflt7\" (UID: \"5b1dc0d2-30e8-46f0-a933-cd321c32590b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056770 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-etcd-serving-ca\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056797 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056841 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-config\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056873 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-dir\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056901 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5cjt\" (UniqueName: \"kubernetes.io/projected/2aaf7d7a-c623-4198-b68e-7efa895cb96f-kube-api-access-b5cjt\") pod \"collect-profiles-29563905-l6rnm\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056931 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z95m\" (UniqueName: \"kubernetes.io/projected/6cfd1d38-a35c-47ab-962f-4403875e5a19-kube-api-access-7z95m\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.056953 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-client-ca\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.057006 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.057111 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.065608 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-oauth-serving-cert\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066072 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-metrics-certs\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066137 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbblb\" (UniqueName: \"kubernetes.io/projected/ea2b4f76-5f61-410a-bb32-c1185e630c7a-kube-api-access-jbblb\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066165 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1dc0d2-30e8-46f0-a933-cd321c32590b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sflt7\" (UID: \"5b1dc0d2-30e8-46f0-a933-cd321c32590b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066193 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-image-import-ca\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066215 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b27563b-d715-4dfa-84b6-5d0f1d90e4b5-serving-cert\") pod \"openshift-config-operator-7777fb866f-fs56m\" (UID: \"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066236 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48wdr\" (UniqueName: \"kubernetes.io/projected/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-kube-api-access-48wdr\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066259 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jddcd\" (UniqueName: \"kubernetes.io/projected/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-kube-api-access-jddcd\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066288 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066312 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v994\" (UniqueName: \"kubernetes.io/projected/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-kube-api-access-4v994\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066333 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-etcd-ca\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066355 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xsqf\" (UniqueName: \"kubernetes.io/projected/7a9e071f-3977-4a85-9b66-30c59efd7d3a-kube-api-access-8xsqf\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066386 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfzv8\" (UniqueName: \"kubernetes.io/projected/bf0cad60-78bb-4325-aaaa-ee2636410fcb-kube-api-access-gfzv8\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066410 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066429 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfd1d38-a35c-47ab-962f-4403875e5a19-serving-cert\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066460 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8fnf\" (UniqueName: \"kubernetes.io/projected/9bb6cf60-3071-4f09-b9aa-e5d05b03804e-kube-api-access-b8fnf\") pod \"openshift-apiserver-operator-796bbdcf4f-wbdxj\" (UID: \"9bb6cf60-3071-4f09-b9aa-e5d05b03804e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066483 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7011ee6-49c9-4b89-ae23-545bac0f68d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wszrt\" (UID: \"f7011ee6-49c9-4b89-ae23-545bac0f68d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066506 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fgg7z\" (UID: \"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066527 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b1dc0d2-30e8-46f0-a933-cd321c32590b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sflt7\" (UID: \"5b1dc0d2-30e8-46f0-a933-cd321c32590b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066552 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-serving-cert\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066579 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf0cad60-78bb-4325-aaaa-ee2636410fcb-console-oauth-config\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066602 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName:
\"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-service-ca\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066627 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066648 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d252be2-ace2-40c4-a4e3-3824d9bffff4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066689 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb6cf60-3071-4f09-b9aa-e5d05b03804e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wbdxj\" (UID: \"9bb6cf60-3071-4f09-b9aa-e5d05b03804e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066712 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5t4x\" (UniqueName: \"kubernetes.io/projected/8b27563b-d715-4dfa-84b6-5d0f1d90e4b5-kube-api-access-n5t4x\") pod \"openshift-config-operator-7777fb866f-fs56m\" (UID: \"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066734 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea2b4f76-5f61-410a-bb32-c1185e630c7a-service-ca-bundle\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066756 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066800 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-proxy-tls\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066831 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/36b1bb7c-dba1-491c-84ba-820adbe96f3a-metrics-tls\") pod \"dns-operator-744455d44c-ntz44\" (UID: \"36b1bb7c-dba1-491c-84ba-820adbe96f3a\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066860 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bf0cad60-78bb-4325-aaaa-ee2636410fcb-console-serving-cert\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066885 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d252be2-ace2-40c4-a4e3-3824d9bffff4-service-ca-bundle\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066911 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-config\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066938 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-trusted-ca-bundle\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066968 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6cfd1d38-a35c-47ab-962f-4403875e5a19-etcd-client\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.066996 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aaf7d7a-c623-4198-b68e-7efa895cb96f-secret-volume\") pod \"collect-profiles-29563905-l6rnm\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067018 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55v4n\" (UniqueName: \"kubernetes.io/projected/3d252be2-ace2-40c4-a4e3-3824d9bffff4-kube-api-access-55v4n\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067042 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g47l\" (UniqueName: \"kubernetes.io/projected/b0c9543d-7fc2-45cb-a925-f3c030a3c8cf-kube-api-access-2g47l\") pod \"cluster-samples-operator-665b6dd947-s2wpl\" (UID: \"b0c9543d-7fc2-45cb-a925-f3c030a3c8cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067055 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-trusted-ca\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067070 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-policies\") pod 
\"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067101 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-etcd-service-ca\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067126 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/081ced5c-b889-43c7-a7a5-bb6561452f4a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-llx6v\" (UID: \"081ced5c-b889-43c7-a7a5-bb6561452f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067150 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm7tl\" (UniqueName: \"kubernetes.io/projected/081ced5c-b889-43c7-a7a5-bb6561452f4a-kube-api-access-wm7tl\") pod \"kube-storage-version-migrator-operator-b67b599dd-llx6v\" (UID: \"081ced5c-b889-43c7-a7a5-bb6561452f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067179 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e7d665e-a211-432c-8884-74096504ab5c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067205 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081ced5c-b889-43c7-a7a5-bb6561452f4a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-llx6v\" (UID: \"081ced5c-b889-43c7-a7a5-bb6561452f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067226 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-serving-cert\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067226 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067250 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7a9e071f-3977-4a85-9b66-30c59efd7d3a-etcd-client\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067274 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7a9e071f-3977-4a85-9b66-30c59efd7d3a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067341 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7011ee6-49c9-4b89-ae23-545bac0f68d0-config\") pod \"kube-apiserver-operator-766d6c64bb-wszrt\" (UID: \"f7011ee6-49c9-4b89-ae23-545bac0f68d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067370 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnzrx\" (UniqueName: \"kubernetes.io/projected/c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4-kube-api-access-mnzrx\") pod \"multus-admission-controller-857f4d67dd-fgg7z\" (UID: \"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067398 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8d01f52f-de93-4fa8-80c2-88b21f4e6400-webhook-cert\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067419 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8b27563b-d715-4dfa-84b6-5d0f1d90e4b5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fs56m\" (UID: \"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067438 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-stats-auth\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067461 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wkgp\" (UniqueName: \"kubernetes.io/projected/8d01f52f-de93-4fa8-80c2-88b21f4e6400-kube-api-access-7wkgp\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067483 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqmf\" (UniqueName: \"kubernetes.io/projected/4a9a21f3-0936-44a9-b1f0-e82345703f0a-kube-api-access-wnqmf\") pod \"openshift-controller-manager-operator-756b6f6bc6-nqqmj\" (UID: \"4a9a21f3-0936-44a9-b1f0-e82345703f0a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067503 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-auth-proxy-config\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067563 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067813 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067835 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.067906 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.068060 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-console-config\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.068891 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.068951 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.068996 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6cfd1d38-a35c-47ab-962f-4403875e5a19-node-pullsecrets\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069023 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-audit\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069083 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-config\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069120 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-machine-approver-tls\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069178 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: 
\"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069204 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67bv4\" (UniqueName: \"kubernetes.io/projected/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-kube-api-access-67bv4\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069226 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-serving-cert\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069265 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2df2\" (UniqueName: \"kubernetes.io/projected/9e7d665e-a211-432c-8884-74096504ab5c-kube-api-access-h2df2\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069288 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069359 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069288 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8d01f52f-de93-4fa8-80c2-88b21f4e6400-tmpfs\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069501 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7a9e071f-3977-4a85-9b66-30c59efd7d3a-encryption-config\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069530 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-service-ca\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069549 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.069882 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.070113 4965 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ntz44"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.071586 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pj8km"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.071925 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0cad60-78bb-4325-aaaa-ee2636410fcb-trusted-ca-bundle\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.072774 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qtnkp"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.076941 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/36b1bb7c-dba1-491c-84ba-820adbe96f3a-metrics-tls\") pod \"dns-operator-744455d44c-ntz44\" (UID: \"36b1bb7c-dba1-491c-84ba-820adbe96f3a\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.076976 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf0cad60-78bb-4325-aaaa-ee2636410fcb-console-serving-cert\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.082358 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf0cad60-78bb-4325-aaaa-ee2636410fcb-console-oauth-config\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " 
pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.085321 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-serving-cert\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.085589 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.087398 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.093891 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5scj2"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.094572 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.097133 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w9kwc"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.097957 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.099240 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fs56m"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.100720 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6w7zg"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.102286 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.104139 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.105593 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sw5p7"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.107706 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.108175 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.113698 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-q9zxt"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.116722 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.116784 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj"] Mar 18 11:58:23 crc kubenswrapper[4965]: 
I0318 11:58:23.118608 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.123583 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sk66r"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.123649 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.125442 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.125872 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8pdbx"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.127091 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cmx7r"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.127695 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.131508 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4xx64"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.132388 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4xx64" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.132523 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.134063 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.138054 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.142006 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k68bp"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.143272 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.147028 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.152256 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.154877 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.154926 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zptvt"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.154935 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-fgg7z"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.166968 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.169160 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.170690 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4xx64"] Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.172565 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-client-ca\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.172606 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7011ee6-49c9-4b89-ae23-545bac0f68d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wszrt\" (UID: \"f7011ee6-49c9-4b89-ae23-545bac0f68d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.172635 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aaf7d7a-c623-4198-b68e-7efa895cb96f-config-volume\") pod \"collect-profiles-29563905-l6rnm\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.176698 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a9e071f-3977-4a85-9b66-30c59efd7d3a-audit-dir\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.176802 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-etcd-client\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.176827 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d252be2-ace2-40c4-a4e3-3824d9bffff4-serving-cert\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.176875 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6cfd1d38-a35c-47ab-962f-4403875e5a19-audit-dir\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.176896 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e7d665e-a211-432c-8884-74096504ab5c-config\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.176915 
4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.176981 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-config\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177015 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-config\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177064 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177272 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6cfd1d38-a35c-47ab-962f-4403875e5a19-encryption-config\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc 
kubenswrapper[4965]: I0318 11:58:23.177297 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98tf\" (UniqueName: \"kubernetes.io/projected/29015bb8-e604-425d-a88b-db3ec9e10096-kube-api-access-x98tf\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177320 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9a21f3-0936-44a9-b1f0-e82345703f0a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nqqmj\" (UID: \"4a9a21f3-0936-44a9-b1f0-e82345703f0a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177345 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-default-certificate\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177388 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-images\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177412 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8d01f52f-de93-4fa8-80c2-88b21f4e6400-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177459 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnzl5\" (UniqueName: \"kubernetes.io/projected/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-kube-api-access-cnzl5\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177482 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9e7d665e-a211-432c-8884-74096504ab5c-images\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177503 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb6cf60-3071-4f09-b9aa-e5d05b03804e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wbdxj\" (UID: \"9bb6cf60-3071-4f09-b9aa-e5d05b03804e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177536 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9e071f-3977-4a85-9b66-30c59efd7d3a-serving-cert\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177576 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/7a9e071f-3977-4a85-9b66-30c59efd7d3a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177616 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29015bb8-e604-425d-a88b-db3ec9e10096-serving-cert\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177684 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0c9543d-7fc2-45cb-a925-f3c030a3c8cf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-s2wpl\" (UID: \"b0c9543d-7fc2-45cb-a925-f3c030a3c8cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177705 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d252be2-ace2-40c4-a4e3-3824d9bffff4-config\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177733 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc 
kubenswrapper[4965]: I0318 11:58:23.177771 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177893 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-etcd-serving-ca\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177916 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a9a21f3-0936-44a9-b1f0-e82345703f0a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nqqmj\" (UID: \"4a9a21f3-0936-44a9-b1f0-e82345703f0a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177968 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1dc0d2-30e8-46f0-a933-cd321c32590b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sflt7\" (UID: \"5b1dc0d2-30e8-46f0-a933-cd321c32590b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.177986 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178015 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-config\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178035 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z95m\" (UniqueName: \"kubernetes.io/projected/6cfd1d38-a35c-47ab-962f-4403875e5a19-kube-api-access-7z95m\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178055 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-client-ca\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178094 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-dir\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178124 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5cjt\" (UniqueName: 
\"kubernetes.io/projected/2aaf7d7a-c623-4198-b68e-7efa895cb96f-kube-api-access-b5cjt\") pod \"collect-profiles-29563905-l6rnm\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178151 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1dc0d2-30e8-46f0-a933-cd321c32590b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sflt7\" (UID: \"5b1dc0d2-30e8-46f0-a933-cd321c32590b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178190 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178211 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-metrics-certs\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178213 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a9e071f-3977-4a85-9b66-30c59efd7d3a-audit-dir\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178255 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbblb\" (UniqueName: \"kubernetes.io/projected/ea2b4f76-5f61-410a-bb32-c1185e630c7a-kube-api-access-jbblb\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178278 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jddcd\" (UniqueName: \"kubernetes.io/projected/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-kube-api-access-jddcd\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178300 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-image-import-ca\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178321 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b27563b-d715-4dfa-84b6-5d0f1d90e4b5-serving-cert\") pod \"openshift-config-operator-7777fb866f-fs56m\" (UID: \"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178356 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48wdr\" (UniqueName: \"kubernetes.io/projected/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-kube-api-access-48wdr\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178377 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178399 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v994\" (UniqueName: \"kubernetes.io/projected/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-kube-api-access-4v994\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178437 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-etcd-ca\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178456 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xsqf\" (UniqueName: \"kubernetes.io/projected/7a9e071f-3977-4a85-9b66-30c59efd7d3a-kube-api-access-8xsqf\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178475 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6cfd1d38-a35c-47ab-962f-4403875e5a19-serving-cert\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178495 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fnf\" (UniqueName: \"kubernetes.io/projected/9bb6cf60-3071-4f09-b9aa-e5d05b03804e-kube-api-access-b8fnf\") pod \"openshift-apiserver-operator-796bbdcf4f-wbdxj\" (UID: \"9bb6cf60-3071-4f09-b9aa-e5d05b03804e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178515 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7011ee6-49c9-4b89-ae23-545bac0f68d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wszrt\" (UID: \"f7011ee6-49c9-4b89-ae23-545bac0f68d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178541 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178561 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fgg7z\" (UID: \"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z" Mar 18 11:58:23 
crc kubenswrapper[4965]: I0318 11:58:23.178581 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b1dc0d2-30e8-46f0-a933-cd321c32590b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sflt7\" (UID: \"5b1dc0d2-30e8-46f0-a933-cd321c32590b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178605 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5t4x\" (UniqueName: \"kubernetes.io/projected/8b27563b-d715-4dfa-84b6-5d0f1d90e4b5-kube-api-access-n5t4x\") pod \"openshift-config-operator-7777fb866f-fs56m\" (UID: \"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178634 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea2b4f76-5f61-410a-bb32-c1185e630c7a-service-ca-bundle\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178692 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178729 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178777 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d252be2-ace2-40c4-a4e3-3824d9bffff4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178800 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb6cf60-3071-4f09-b9aa-e5d05b03804e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wbdxj\" (UID: \"9bb6cf60-3071-4f09-b9aa-e5d05b03804e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178845 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-proxy-tls\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178870 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d252be2-ace2-40c4-a4e3-3824d9bffff4-service-ca-bundle\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 
18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178892 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-config\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178921 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6cfd1d38-a35c-47ab-962f-4403875e5a19-etcd-client\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178945 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aaf7d7a-c623-4198-b68e-7efa895cb96f-secret-volume\") pod \"collect-profiles-29563905-l6rnm\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178971 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55v4n\" (UniqueName: \"kubernetes.io/projected/3d252be2-ace2-40c4-a4e3-3824d9bffff4-kube-api-access-55v4n\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.178994 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g47l\" (UniqueName: \"kubernetes.io/projected/b0c9543d-7fc2-45cb-a925-f3c030a3c8cf-kube-api-access-2g47l\") pod \"cluster-samples-operator-665b6dd947-s2wpl\" (UID: 
\"b0c9543d-7fc2-45cb-a925-f3c030a3c8cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179022 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-policies\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179043 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-etcd-service-ca\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179065 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/081ced5c-b889-43c7-a7a5-bb6561452f4a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-llx6v\" (UID: \"081ced5c-b889-43c7-a7a5-bb6561452f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179084 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm7tl\" (UniqueName: \"kubernetes.io/projected/081ced5c-b889-43c7-a7a5-bb6561452f4a-kube-api-access-wm7tl\") pod \"kube-storage-version-migrator-operator-b67b599dd-llx6v\" (UID: \"081ced5c-b889-43c7-a7a5-bb6561452f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179122 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-serving-cert\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179154 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7a9e071f-3977-4a85-9b66-30c59efd7d3a-etcd-client\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179173 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7a9e071f-3977-4a85-9b66-30c59efd7d3a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179198 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7011ee6-49c9-4b89-ae23-545bac0f68d0-config\") pod \"kube-apiserver-operator-766d6c64bb-wszrt\" (UID: \"f7011ee6-49c9-4b89-ae23-545bac0f68d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179229 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e7d665e-a211-432c-8884-74096504ab5c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179257 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081ced5c-b889-43c7-a7a5-bb6561452f4a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-llx6v\" (UID: \"081ced5c-b889-43c7-a7a5-bb6561452f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179284 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnzrx\" (UniqueName: \"kubernetes.io/projected/c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4-kube-api-access-mnzrx\") pod \"multus-admission-controller-857f4d67dd-fgg7z\" (UID: \"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179308 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8d01f52f-de93-4fa8-80c2-88b21f4e6400-webhook-cert\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179325 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8b27563b-d715-4dfa-84b6-5d0f1d90e4b5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fs56m\" (UID: \"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179365 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-stats-auth\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179389 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wkgp\" (UniqueName: \"kubernetes.io/projected/8d01f52f-de93-4fa8-80c2-88b21f4e6400-kube-api-access-7wkgp\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179410 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnqmf\" (UniqueName: \"kubernetes.io/projected/4a9a21f3-0936-44a9-b1f0-e82345703f0a-kube-api-access-wnqmf\") pod \"openshift-controller-manager-operator-756b6f6bc6-nqqmj\" (UID: \"4a9a21f3-0936-44a9-b1f0-e82345703f0a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179432 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-auth-proxy-config\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179452 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179471 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6cfd1d38-a35c-47ab-962f-4403875e5a19-node-pullsecrets\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179491 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-audit\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179514 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179531 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-machine-approver-tls\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179553 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-config\") pod 
\"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179574 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179595 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67bv4\" (UniqueName: \"kubernetes.io/projected/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-kube-api-access-67bv4\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179628 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-serving-cert\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179697 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179719 4965 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-h2df2\" (UniqueName: \"kubernetes.io/projected/9e7d665e-a211-432c-8884-74096504ab5c-kube-api-access-h2df2\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179736 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8d01f52f-de93-4fa8-80c2-88b21f4e6400-tmpfs\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179757 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7a9e071f-3977-4a85-9b66-30c59efd7d3a-encryption-config\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.179777 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7a9e071f-3977-4a85-9b66-30c59efd7d3a-audit-policies\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.180483 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6cfd1d38-a35c-47ab-962f-4403875e5a19-audit-dir\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.180543 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7a9e071f-3977-4a85-9b66-30c59efd7d3a-audit-policies\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.181407 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e7d665e-a211-432c-8884-74096504ab5c-config\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.185589 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d252be2-ace2-40c4-a4e3-3824d9bffff4-serving-cert\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.191954 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.198223 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-etcd-client\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.198638 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-config\") pod \"etcd-operator-b45778765-6w7zg\" (UID: 
\"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.199691 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d252be2-ace2-40c4-a4e3-3824d9bffff4-service-ca-bundle\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.200202 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-audit\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.200589 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8b27563b-d715-4dfa-84b6-5d0f1d90e4b5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fs56m\" (UID: \"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.201247 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-auth-proxy-config\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.202180 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.202495 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-config\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.202926 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-dir\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.205282 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-policies\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.205700 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-etcd-service-ca\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.213530 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6cfd1d38-a35c-47ab-962f-4403875e5a19-node-pullsecrets\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.213574 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-etcd-ca\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.214153 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7a9e071f-3977-4a85-9b66-30c59efd7d3a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.214620 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8d01f52f-de93-4fa8-80c2-88b21f4e6400-tmpfs\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.214906 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9e071f-3977-4a85-9b66-30c59efd7d3a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.215719 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9bb6cf60-3071-4f09-b9aa-e5d05b03804e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wbdxj\" (UID: \"9bb6cf60-3071-4f09-b9aa-e5d05b03804e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.215910 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d252be2-ace2-40c4-a4e3-3824d9bffff4-config\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.216214 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.216386 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-etcd-serving-ca\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.216859 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a9a21f3-0936-44a9-b1f0-e82345703f0a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nqqmj\" (UID: \"4a9a21f3-0936-44a9-b1f0-e82345703f0a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.220104 
4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.220501 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-serving-cert\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.220627 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-client-ca\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.221072 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-image-import-ca\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.221209 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d252be2-ace2-40c4-a4e3-3824d9bffff4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 
11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.221477 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfd1d38-a35c-47ab-962f-4403875e5a19-serving-cert\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.221707 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.221796 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.200145 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-config\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.222006 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.222028 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.201649 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.222215 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cfd1d38-a35c-47ab-962f-4403875e5a19-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.222334 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6cfd1d38-a35c-47ab-962f-4403875e5a19-encryption-config\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.222703 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb6cf60-3071-4f09-b9aa-e5d05b03804e-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-wbdxj\" (UID: \"9bb6cf60-3071-4f09-b9aa-e5d05b03804e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.222947 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9e7d665e-a211-432c-8884-74096504ab5c-images\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223075 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-config\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223238 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0c9543d-7fc2-45cb-a925-f3c030a3c8cf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-s2wpl\" (UID: \"b0c9543d-7fc2-45cb-a925-f3c030a3c8cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223345 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223351 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e7d665e-a211-432c-8884-74096504ab5c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223594 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223638 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9a21f3-0936-44a9-b1f0-e82345703f0a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nqqmj\" (UID: \"4a9a21f3-0936-44a9-b1f0-e82345703f0a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223697 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a9e071f-3977-4a85-9b66-30c59efd7d3a-serving-cert\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223836 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223858 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7a9e071f-3977-4a85-9b66-30c59efd7d3a-encryption-config\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223875 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7a9e071f-3977-4a85-9b66-30c59efd7d3a-etcd-client\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.223985 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b27563b-d715-4dfa-84b6-5d0f1d90e4b5-serving-cert\") pod \"openshift-config-operator-7777fb866f-fs56m\" (UID: \"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.224177 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.224269 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-machine-approver-tls\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.224285 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6cfd1d38-a35c-47ab-962f-4403875e5a19-etcd-client\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.224416 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4sskq"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.224427 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.224829 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.226809 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mmzsn"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.226849 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmt4c"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.227957 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.228886 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.229804 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.229991 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.230625 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w9kwc"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.231733 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.233073 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.233602 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-serving-cert\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.234424 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xjqrf"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.235711 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xjqrf"]
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.235981 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xjqrf"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.236679 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.241757 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.242421 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29015bb8-e604-425d-a88b-db3ec9e10096-serving-cert\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.247688 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.253083 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-config\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.267199 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.268999 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-client-ca\") pod \"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.288459 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.307512 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.328245 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.348015 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.354131 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7011ee6-49c9-4b89-ae23-545bac0f68d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wszrt\" (UID: \"f7011ee6-49c9-4b89-ae23-545bac0f68d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.367558 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.375022 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7011ee6-49c9-4b89-ae23-545bac0f68d0-config\") pod \"kube-apiserver-operator-766d6c64bb-wszrt\" (UID: \"f7011ee6-49c9-4b89-ae23-545bac0f68d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.387228 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.407175 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.432449 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.447188 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.468461 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.487779 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.506820 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.528502 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.536979 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aaf7d7a-c623-4198-b68e-7efa895cb96f-secret-volume\") pod \"collect-profiles-29563905-l6rnm\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.549090 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.568201 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.587267 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.589322 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aaf7d7a-c623-4198-b68e-7efa895cb96f-config-volume\") pod \"collect-profiles-29563905-l6rnm\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.608480 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.634314 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.646362 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/081ced5c-b889-43c7-a7a5-bb6561452f4a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-llx6v\" (UID: \"081ced5c-b889-43c7-a7a5-bb6561452f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.647565 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.653755 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081ced5c-b889-43c7-a7a5-bb6561452f4a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-llx6v\" (UID: \"081ced5c-b889-43c7-a7a5-bb6561452f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.668582 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.707631 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.728026 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.738436 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8d01f52f-de93-4fa8-80c2-88b21f4e6400-apiservice-cert\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.739763 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8d01f52f-de93-4fa8-80c2-88b21f4e6400-webhook-cert\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.748051 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.768375 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.787114 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.807875 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.827647 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.848651 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.851508 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1dc0d2-30e8-46f0-a933-cd321c32590b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sflt7\" (UID: \"5b1dc0d2-30e8-46f0-a933-cd321c32590b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.868836 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.887823 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.897047 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1dc0d2-30e8-46f0-a933-cd321c32590b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sflt7\" (UID: \"5b1dc0d2-30e8-46f0-a933-cd321c32590b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.907834 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.928063 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.948405 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.967758 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 11:58:23 crc kubenswrapper[4965]: I0318 11:58:23.988886 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.008035 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.020974 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9jx9z"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.025927 4965 request.go:700] Waited for 1.00646557s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-controller-manager-operator-config&limit=500&resourceVersion=0
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.028304 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.048770 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.068536 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.079389 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-default-certificate\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.088973 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.106898 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.114639 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea2b4f76-5f61-410a-bb32-c1185e630c7a-service-ca-bundle\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.129345 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.139104 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-stats-auth\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.148075 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.168423 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.188007 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.207880 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: E0318 11:58:24.221057 4965 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 18 11:58:24 crc kubenswrapper[4965]: E0318 11:58:24.221165 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-proxy-tls podName:89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:24.721135998 +0000 UTC m=+109.707323517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-proxy-tls") pod "machine-config-operator-74547568cd-b4nfn" (UID: "89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47") : failed to sync secret cache: timed out waiting for the condition
Mar 18 11:58:24 crc kubenswrapper[4965]: E0318 11:58:24.222169 4965 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition
Mar 18 11:58:24 crc kubenswrapper[4965]: E0318 11:58:24.222242 4965 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition
Mar 18 11:58:24 crc kubenswrapper[4965]: E0318 11:58:24.222257 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-metrics-certs podName:ea2b4f76-5f61-410a-bb32-c1185e630c7a nodeName:}" failed. No retries permitted until 2026-03-18 11:58:24.722238708 +0000 UTC m=+109.708426187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-metrics-certs") pod "router-default-5444994796-xhsv5" (UID: "ea2b4f76-5f61-410a-bb32-c1185e630c7a") : failed to sync secret cache: timed out waiting for the condition
Mar 18 11:58:24 crc kubenswrapper[4965]: E0318 11:58:24.222392 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4-webhook-certs podName:c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:24.722350292 +0000 UTC m=+109.708537771 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4-webhook-certs") pod "multus-admission-controller-857f4d67dd-fgg7z" (UID: "c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4") : failed to sync secret cache: timed out waiting for the condition
Mar 18 11:58:24 crc kubenswrapper[4965]: E0318 11:58:24.222471 4965 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 18 11:58:24 crc kubenswrapper[4965]: E0318 11:58:24.222539 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-images podName:89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47 nodeName:}" failed. No retries permitted until 2026-03-18 11:58:24.722523846 +0000 UTC m=+109.708711415 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-images") pod "machine-config-operator-74547568cd-b4nfn" (UID: "89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.226970 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.247941 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.275041 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.288799 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.327854 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.347582 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.368409 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.388708 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.407477 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.428792 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.448495 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.468412 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.487992 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.507288 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.528335 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.548242 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.568587 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.588343 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.609202 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.647518 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92wjp\" (UniqueName: \"kubernetes.io/projected/36b1bb7c-dba1-491c-84ba-820adbe96f3a-kube-api-access-92wjp\") pod \"dns-operator-744455d44c-ntz44\" (UID: \"36b1bb7c-dba1-491c-84ba-820adbe96f3a\") " pod="openshift-dns-operator/dns-operator-744455d44c-ntz44"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.648611 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.688423 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.689383 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm4nn\" (UniqueName: \"kubernetes.io/projected/00cf43a5-c3a1-4a4a-a49c-3707bb515d59-kube-api-access-xm4nn\") pod \"console-operator-58897d9998-qtnkp\" (UID: \"00cf43a5-c3a1-4a4a-a49c-3707bb515d59\") " pod="openshift-console-operator/console-operator-58897d9998-qtnkp"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.729652 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.740712 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfzv8\" (UniqueName: \"kubernetes.io/projected/bf0cad60-78bb-4325-aaaa-ee2636410fcb-kube-api-access-gfzv8\") pod \"console-f9d7485db-pj8km\" (UID: \"bf0cad60-78bb-4325-aaaa-ee2636410fcb\") " pod="openshift-console/console-f9d7485db-pj8km"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.747998 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.764404 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qtnkp"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.768239 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.787947 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.797718 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pj8km"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.802471 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-images\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.802571 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-metrics-certs\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.802646 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fgg7z\" (UID: \"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.802729 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-proxy-tls\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.803956 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-images\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.807277 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fgg7z\" (UID: \"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.807558 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2b4f76-5f61-410a-bb32-c1185e630c7a-metrics-certs\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.808359 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.810541 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-proxy-tls\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.829362 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.831476 4965 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.848817 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.868124 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.888125 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.918284 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.928308 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.948899 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.969446 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 11:58:24 crc kubenswrapper[4965]: I0318 11:58:24.988311 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.007866 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.026604 4965 request.go:700] Waited for 1.893851131s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0 Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.051818 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.051922 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.067994 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.108432 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7011ee6-49c9-4b89-ae23-545bac0f68d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wszrt\" (UID: \"f7011ee6-49c9-4b89-ae23-545bac0f68d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.120222 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ntz44"] Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.122581 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b1dc0d2-30e8-46f0-a933-cd321c32590b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sflt7\" (UID: \"5b1dc0d2-30e8-46f0-a933-cd321c32590b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.142338 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5t4x\" (UniqueName: 
\"kubernetes.io/projected/8b27563b-d715-4dfa-84b6-5d0f1d90e4b5-kube-api-access-n5t4x\") pod \"openshift-config-operator-7777fb866f-fs56m\" (UID: \"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.162727 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g47l\" (UniqueName: \"kubernetes.io/projected/b0c9543d-7fc2-45cb-a925-f3c030a3c8cf-kube-api-access-2g47l\") pod \"cluster-samples-operator-665b6dd947-s2wpl\" (UID: \"b0c9543d-7fc2-45cb-a925-f3c030a3c8cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.181708 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55v4n\" (UniqueName: \"kubernetes.io/projected/3d252be2-ace2-40c4-a4e3-3824d9bffff4-kube-api-access-55v4n\") pod \"authentication-operator-69f744f599-k68bp\" (UID: \"3d252be2-ace2-40c4-a4e3-3824d9bffff4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.201075 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnzrx\" (UniqueName: \"kubernetes.io/projected/c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4-kube-api-access-mnzrx\") pod \"multus-admission-controller-857f4d67dd-fgg7z\" (UID: \"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.215755 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.229020 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qtnkp"] Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.229826 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pj8km"] Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.232199 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqmf\" (UniqueName: \"kubernetes.io/projected/4a9a21f3-0936-44a9-b1f0-e82345703f0a-kube-api-access-wnqmf\") pod \"openshift-controller-manager-operator-756b6f6bc6-nqqmj\" (UID: \"4a9a21f3-0936-44a9-b1f0-e82345703f0a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.235051 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" Mar 18 11:58:25 crc kubenswrapper[4965]: W0318 11:58:25.238608 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00cf43a5_c3a1_4a4a_a49c_3707bb515d59.slice/crio-0acba46a8ba8a3fa65287d34ec44a2551e88e041149a84180b591c6256293224 WatchSource:0}: Error finding container 0acba46a8ba8a3fa65287d34ec44a2551e88e041149a84180b591c6256293224: Status 404 returned error can't find the container with id 0acba46a8ba8a3fa65287d34ec44a2551e88e041149a84180b591c6256293224 Mar 18 11:58:25 crc kubenswrapper[4965]: W0318 11:58:25.242831 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0cad60_78bb_4325_aaaa_ee2636410fcb.slice/crio-3f5b20d902b5a93c0491a7505fd6efe0582d84896f5600ac47c7f24ee9b94826 WatchSource:0}: Error finding container 3f5b20d902b5a93c0491a7505fd6efe0582d84896f5600ac47c7f24ee9b94826: Status 404 returned error can't find the container with id 3f5b20d902b5a93c0491a7505fd6efe0582d84896f5600ac47c7f24ee9b94826 Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.244715 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.245766 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wkgp\" (UniqueName: \"kubernetes.io/projected/8d01f52f-de93-4fa8-80c2-88b21f4e6400-kube-api-access-7wkgp\") pod \"packageserver-d55dfcdfc-dddvm\" (UID: \"8d01f52f-de93-4fa8-80c2-88b21f4e6400\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.265635 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2df2\" (UniqueName: \"kubernetes.io/projected/9e7d665e-a211-432c-8884-74096504ab5c-kube-api-access-h2df2\") pod \"machine-api-operator-5694c8668f-zptvt\" (UID: \"9e7d665e-a211-432c-8884-74096504ab5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.284927 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67bv4\" (UniqueName: \"kubernetes.io/projected/89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47-kube-api-access-67bv4\") pod \"machine-config-operator-74547568cd-b4nfn\" (UID: \"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.311336 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jddcd\" (UniqueName: \"kubernetes.io/projected/9b045370-3446-4fe4-b7c4-d0b12ddd06f4-kube-api-access-jddcd\") pod \"etcd-operator-b45778765-6w7zg\" (UID: \"9b045370-3446-4fe4-b7c4-d0b12ddd06f4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.323233 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.323443 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z95m\" (UniqueName: \"kubernetes.io/projected/6cfd1d38-a35c-47ab-962f-4403875e5a19-kube-api-access-7z95m\") pod \"apiserver-76f77b778f-mmzsn\" (UID: \"6cfd1d38-a35c-47ab-962f-4403875e5a19\") " pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.330129 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.347075 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5cjt\" (UniqueName: \"kubernetes.io/projected/2aaf7d7a-c623-4198-b68e-7efa895cb96f-kube-api-access-b5cjt\") pod \"collect-profiles-29563905-l6rnm\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.359471 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.367339 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm7tl\" (UniqueName: \"kubernetes.io/projected/081ced5c-b889-43c7-a7a5-bb6561452f4a-kube-api-access-wm7tl\") pod \"kube-storage-version-migrator-operator-b67b599dd-llx6v\" (UID: \"081ced5c-b889-43c7-a7a5-bb6561452f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.371578 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.381933 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.384382 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48wdr\" (UniqueName: \"kubernetes.io/projected/ae68bf3c-11de-49ed-8fc2-a669cb8a04fa-kube-api-access-48wdr\") pod \"machine-approver-56656f9798-rwvcf\" (UID: \"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.397058 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.403431 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v994\" (UniqueName: \"kubernetes.io/projected/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-kube-api-access-4v994\") pod \"oauth-openshift-558db77b4-sk66r\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.404228 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.421166 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xsqf\" (UniqueName: \"kubernetes.io/projected/7a9e071f-3977-4a85-9b66-30c59efd7d3a-kube-api-access-8xsqf\") pod \"apiserver-7bbb656c7d-qdbck\" (UID: \"7a9e071f-3977-4a85-9b66-30c59efd7d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.437244 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.441023 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.446189 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fnf\" (UniqueName: \"kubernetes.io/projected/9bb6cf60-3071-4f09-b9aa-e5d05b03804e-kube-api-access-b8fnf\") pod \"openshift-apiserver-operator-796bbdcf4f-wbdxj\" (UID: \"9bb6cf60-3071-4f09-b9aa-e5d05b03804e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.462051 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.464413 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl"] Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.465024 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbblb\" (UniqueName: \"kubernetes.io/projected/ea2b4f76-5f61-410a-bb32-c1185e630c7a-kube-api-access-jbblb\") pod \"router-default-5444994796-xhsv5\" (UID: \"ea2b4f76-5f61-410a-bb32-c1185e630c7a\") " pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.470364 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.479548 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k68bp"] Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.487460 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98tf\" (UniqueName: \"kubernetes.io/projected/29015bb8-e604-425d-a88b-db3ec9e10096-kube-api-access-x98tf\") pod \"controller-manager-879f6c89f-sw5p7\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.507496 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.508268 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnzl5\" (UniqueName: \"kubernetes.io/projected/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-kube-api-access-cnzl5\") pod 
\"route-controller-manager-6576b87f9c-5x9mq\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.529891 4965 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.531032 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fs56m"] Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.534319 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.548234 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" event={"ID":"36b1bb7c-dba1-491c-84ba-820adbe96f3a","Type":"ContainerStarted","Data":"df53ab7b0386cd76ccffc27db55cdc1773521e8ce387221951e31c2d83e43ce7"} Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.548273 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" event={"ID":"36b1bb7c-dba1-491c-84ba-820adbe96f3a","Type":"ContainerStarted","Data":"77685da1bbb157b50c400a2e726f89579aa25c4762b43fab6d5ab3d6097406f4"} Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.550764 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.552640 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qtnkp" event={"ID":"00cf43a5-c3a1-4a4a-a49c-3707bb515d59","Type":"ContainerStarted","Data":"e22637454aa14b64d0db5ef42d540d67c3cfb55eb6aff4bb94895699944e950f"} Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 
11:58:25.552695 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qtnkp" event={"ID":"00cf43a5-c3a1-4a4a-a49c-3707bb515d59","Type":"ContainerStarted","Data":"0acba46a8ba8a3fa65287d34ec44a2551e88e041149a84180b591c6256293224"} Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.553059 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qtnkp" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.554004 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" event={"ID":"3d252be2-ace2-40c4-a4e3-3824d9bffff4","Type":"ContainerStarted","Data":"c577476254ca56a50316514942837e56f58d26136eff8a514956af8f6e07751e"} Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.558096 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pj8km" event={"ID":"bf0cad60-78bb-4325-aaaa-ee2636410fcb","Type":"ContainerStarted","Data":"daf3c3420d9913da8f7abfc27f0b2ae5abd4b2b2648d768a75965436e5b0b622"} Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.558123 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pj8km" event={"ID":"bf0cad60-78bb-4325-aaaa-ee2636410fcb","Type":"ContainerStarted","Data":"3f5b20d902b5a93c0491a7505fd6efe0582d84896f5600ac47c7f24ee9b94826"} Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.562145 4965 patch_prober.go:28] interesting pod/console-operator-58897d9998-qtnkp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.562197 4965 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-qtnkp" podUID="00cf43a5-c3a1-4a4a-a49c-3707bb515d59" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.578027 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.580767 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj"] Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.580888 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.590806 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 11:58:25 crc kubenswrapper[4965]: W0318 11:58:25.600960 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a9a21f3_0936_44a9_b1f0_e82345703f0a.slice/crio-b8391859c9bdcacaa7aa45e9fe88f503f54c82e9cc659fcb98f7e066208ee525 WatchSource:0}: Error finding container b8391859c9bdcacaa7aa45e9fe88f503f54c82e9cc659fcb98f7e066208ee525: Status 404 returned error can't find the container with id b8391859c9bdcacaa7aa45e9fe88f503f54c82e9cc659fcb98f7e066208ee525 Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.614175 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.639094 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.644948 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.649023 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt"] Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.652494 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714114 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714397 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f98952eb-6e5b-4114-abf8-f440a0c97b76-signing-key\") pod \"service-ca-9c57cc56f-4sskq\" (UID: \"f98952eb-6e5b-4114-abf8-f440a0c97b76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714425 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f98952eb-6e5b-4114-abf8-f440a0c97b76-signing-cabundle\") pod \"service-ca-9c57cc56f-4sskq\" (UID: \"f98952eb-6e5b-4114-abf8-f440a0c97b76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:25 
crc kubenswrapper[4965]: I0318 11:58:25.714444 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-bound-sa-token\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714464 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-certificates\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714490 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714517 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714537 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714553 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714572 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbg56\" (UniqueName: \"kubernetes.io/projected/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-kube-api-access-vbg56\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714589 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714607 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hml7\" (UniqueName: \"kubernetes.io/projected/f98952eb-6e5b-4114-abf8-f440a0c97b76-kube-api-access-8hml7\") pod \"service-ca-9c57cc56f-4sskq\" (UID: 
\"f98952eb-6e5b-4114-abf8-f440a0c97b76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714623 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-tls\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714646 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8pdbx\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714674 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8pdbx\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714691 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgcc8\" (UniqueName: \"kubernetes.io/projected/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-kube-api-access-wgcc8\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714709 4965 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb8ab568-8d37-4899-9337-600b4f41dfe5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-q4h66\" (UID: \"fb8ab568-8d37-4899-9337-600b4f41dfe5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714727 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-trusted-ca\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714757 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714777 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb8ab568-8d37-4899-9337-600b4f41dfe5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-q4h66\" (UID: \"fb8ab568-8d37-4899-9337-600b4f41dfe5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714801 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fb8ab568-8d37-4899-9337-600b4f41dfe5-config\") pod \"kube-controller-manager-operator-78b949d7b-q4h66\" (UID: \"fb8ab568-8d37-4899-9337-600b4f41dfe5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714816 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64htn\" (UniqueName: \"kubernetes.io/projected/18b96e51-1a8b-4947-a9a3-9af0c0e88100-kube-api-access-64htn\") pod \"downloads-7954f5f757-q9zxt\" (UID: \"18b96e51-1a8b-4947-a9a3-9af0c0e88100\") " pod="openshift-console/downloads-7954f5f757-q9zxt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714835 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-trusted-ca\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714863 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwm5l\" (UniqueName: \"kubernetes.io/projected/beb408f9-638c-4bd9-b4ec-c72bc43286e3-kube-api-access-dwm5l\") pod \"marketplace-operator-79b997595-8pdbx\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714879 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfnlh\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-kube-api-access-vfnlh\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.714897 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-metrics-tls\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: E0318 11:58:25.715194 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.215182054 +0000 UTC m=+111.201369533 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.721378 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qtnkp" podStartSLOduration=49.721359865 podStartE2EDuration="49.721359865s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:25.71972648 +0000 UTC m=+110.705913959" watchObservedRunningTime="2026-03-18 11:58:25.721359865 +0000 UTC m=+110.707547344" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.722598 4965 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.779173 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zptvt"] Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.815335 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:25 crc kubenswrapper[4965]: E0318 11:58:25.815459 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.315442427 +0000 UTC m=+111.301629906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.815565 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm28w\" (UniqueName: \"kubernetes.io/projected/ccc8d88d-863a-43b2-916d-b9df5a38453d-kube-api-access-cm28w\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.815645 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.815678 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n57b4\" (UniqueName: \"kubernetes.io/projected/01c95b1b-48b1-47a7-9da0-7e6c8ced735a-kube-api-access-n57b4\") pod \"machine-config-server-cmx7r\" (UID: \"01c95b1b-48b1-47a7-9da0-7e6c8ced735a\") " pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.815710 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f9a78ea-4484-46dd-ab80-ac76fc089420-config\") 
pod \"service-ca-operator-777779d784-jvz5t\" (UID: \"7f9a78ea-4484-46dd-ab80-ac76fc089420\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.815756 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jcl7\" (UniqueName: \"kubernetes.io/projected/c66fcc55-a5d5-4886-afa8-33dd5be2f631-kube-api-access-8jcl7\") pod \"olm-operator-6b444d44fb-cv5z6\" (UID: \"c66fcc55-a5d5-4886-afa8-33dd5be2f631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.815772 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wfdq\" (UniqueName: \"kubernetes.io/projected/ab0e8f3c-d83a-4b3a-aee7-f530f82761f8-kube-api-access-9wfdq\") pod \"ingress-canary-4xx64\" (UID: \"ab0e8f3c-d83a-4b3a-aee7-f530f82761f8\") " pod="openshift-ingress-canary/ingress-canary-4xx64" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.816283 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/19d4f554-489c-46b5-aba4-919ab05d8853-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bksfk\" (UID: \"19d4f554-489c-46b5-aba4-919ab05d8853\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.816350 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdcrt\" (UniqueName: \"kubernetes.io/projected/19d4f554-489c-46b5-aba4-919ab05d8853-kube-api-access-vdcrt\") pod \"package-server-manager-789f6589d5-bksfk\" (UID: \"19d4f554-489c-46b5-aba4-919ab05d8853\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.816380 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f98952eb-6e5b-4114-abf8-f440a0c97b76-signing-key\") pod \"service-ca-9c57cc56f-4sskq\" (UID: \"f98952eb-6e5b-4114-abf8-f440a0c97b76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.816455 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f98952eb-6e5b-4114-abf8-f440a0c97b76-signing-cabundle\") pod \"service-ca-9c57cc56f-4sskq\" (UID: \"f98952eb-6e5b-4114-abf8-f440a0c97b76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.816481 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22snl\" (UniqueName: \"kubernetes.io/projected/ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8-kube-api-access-22snl\") pod \"dns-default-w9kwc\" (UID: \"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8\") " pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.816537 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccc8d88d-863a-43b2-916d-b9df5a38453d-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.816563 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-bound-sa-token\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: 
\"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.828891 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcw4z\" (UniqueName: \"kubernetes.io/projected/98ebf2d2-03be-474a-a318-0e91e9164758-kube-api-access-dcw4z\") pod \"catalog-operator-68c6474976-nb2r4\" (UID: \"98ebf2d2-03be-474a-a318-0e91e9164758\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829143 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9a78ea-4484-46dd-ab80-ac76fc089420-serving-cert\") pod \"service-ca-operator-777779d784-jvz5t\" (UID: \"7f9a78ea-4484-46dd-ab80-ac76fc089420\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829176 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/98ebf2d2-03be-474a-a318-0e91e9164758-profile-collector-cert\") pod \"catalog-operator-68c6474976-nb2r4\" (UID: \"98ebf2d2-03be-474a-a318-0e91e9164758\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829351 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-certificates\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829423 4965 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f98952eb-6e5b-4114-abf8-f440a0c97b76-signing-cabundle\") pod \"service-ca-9c57cc56f-4sskq\" (UID: \"f98952eb-6e5b-4114-abf8-f440a0c97b76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829517 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829647 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829702 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829822 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbg56\" (UniqueName: \"kubernetes.io/projected/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-kube-api-access-vbg56\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829854 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: E0318 11:58:25.829866 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.329852634 +0000 UTC m=+111.316040113 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829896 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829920 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-csi-data-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.829938 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8-metrics-tls\") pod \"dns-default-w9kwc\" (UID: \"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8\") " pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830001 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hml7\" (UniqueName: \"kubernetes.io/projected/f98952eb-6e5b-4114-abf8-f440a0c97b76-kube-api-access-8hml7\") pod \"service-ca-9c57cc56f-4sskq\" (UID: \"f98952eb-6e5b-4114-abf8-f440a0c97b76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830454 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-mountpoint-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830493 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-tls\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830511 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vmmxw\" (UniqueName: \"kubernetes.io/projected/fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3-kube-api-access-vmmxw\") pod \"control-plane-machine-set-operator-78cbb6b69f-vfp9m\" (UID: \"fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830573 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ccc8d88d-863a-43b2-916d-b9df5a38453d-ready\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830599 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01c95b1b-48b1-47a7-9da0-7e6c8ced735a-certs\") pod \"machine-config-server-cmx7r\" (UID: \"01c95b1b-48b1-47a7-9da0-7e6c8ced735a\") " pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830627 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8pdbx\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830644 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/98ebf2d2-03be-474a-a318-0e91e9164758-srv-cert\") pod \"catalog-operator-68c6474976-nb2r4\" (UID: \"98ebf2d2-03be-474a-a318-0e91e9164758\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" 
Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830699 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8pdbx\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830912 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgcc8\" (UniqueName: \"kubernetes.io/projected/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-kube-api-access-wgcc8\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830936 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c66fcc55-a5d5-4886-afa8-33dd5be2f631-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cv5z6\" (UID: \"c66fcc55-a5d5-4886-afa8-33dd5be2f631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.830970 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb8ab568-8d37-4899-9337-600b4f41dfe5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-q4h66\" (UID: \"fb8ab568-8d37-4899-9337-600b4f41dfe5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.833523 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.833996 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8-config-volume\") pod \"dns-default-w9kwc\" (UID: \"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8\") " pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.834068 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab0e8f3c-d83a-4b3a-aee7-f530f82761f8-cert\") pod \"ingress-canary-4xx64\" (UID: \"ab0e8f3c-d83a-4b3a-aee7-f530f82761f8\") " pod="openshift-ingress-canary/ingress-canary-4xx64" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.834160 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-trusted-ca\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.834336 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb8ab568-8d37-4899-9337-600b4f41dfe5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-q4h66\" (UID: \"fb8ab568-8d37-4899-9337-600b4f41dfe5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.834908 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01c95b1b-48b1-47a7-9da0-7e6c8ced735a-node-bootstrap-token\") pod \"machine-config-server-cmx7r\" (UID: \"01c95b1b-48b1-47a7-9da0-7e6c8ced735a\") " pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.835571 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flzcg\" (UniqueName: \"kubernetes.io/projected/6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9-kube-api-access-flzcg\") pod \"machine-config-controller-84d6567774-kwbct\" (UID: \"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.837680 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.838016 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8pdbx\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.838236 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-certificates\") pod 
\"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.838565 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-plugins-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.838644 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.838688 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb8ab568-8d37-4899-9337-600b4f41dfe5-config\") pod \"kube-controller-manager-operator-78b949d7b-q4h66\" (UID: \"fb8ab568-8d37-4899-9337-600b4f41dfe5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.838760 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64htn\" (UniqueName: \"kubernetes.io/projected/18b96e51-1a8b-4947-a9a3-9af0c0e88100-kube-api-access-64htn\") pod \"downloads-7954f5f757-q9zxt\" (UID: \"18b96e51-1a8b-4947-a9a3-9af0c0e88100\") " pod="openshift-console/downloads-7954f5f757-q9zxt" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.838828 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-trusted-ca\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.840844 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-trusted-ca\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.842010 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-trusted-ca\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.843371 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-socket-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.843458 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bbq\" (UniqueName: \"kubernetes.io/projected/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-kube-api-access-l4bbq\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.844193 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klczw\" (UniqueName: \"kubernetes.io/projected/e712d1d3-e146-41c2-a5ef-610e4bb8771c-kube-api-access-klczw\") pod \"migrator-59844c95c7-2dm7f\" (UID: \"e712d1d3-e146-41c2-a5ef-610e4bb8771c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.844241 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ccc8d88d-863a-43b2-916d-b9df5a38453d-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.844356 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-registration-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.844582 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwm5l\" (UniqueName: \"kubernetes.io/projected/beb408f9-638c-4bd9-b4ec-c72bc43286e3-kube-api-access-dwm5l\") pod \"marketplace-operator-79b997595-8pdbx\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.844616 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-vfp9m\" (UID: \"fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.844687 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c66fcc55-a5d5-4886-afa8-33dd5be2f631-srv-cert\") pod \"olm-operator-6b444d44fb-cv5z6\" (UID: \"c66fcc55-a5d5-4886-afa8-33dd5be2f631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.844770 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfnlh\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-kube-api-access-vfnlh\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.844950 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9-proxy-tls\") pod \"machine-config-controller-84d6567774-kwbct\" (UID: \"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.845073 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-metrics-tls\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.845101 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kwbct\" (UID: \"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.845147 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqs42\" (UniqueName: \"kubernetes.io/projected/7f9a78ea-4484-46dd-ab80-ac76fc089420-kube-api-access-mqs42\") pod \"service-ca-operator-777779d784-jvz5t\" (UID: \"7f9a78ea-4484-46dd-ab80-ac76fc089420\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.851157 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb8ab568-8d37-4899-9337-600b4f41dfe5-config\") pod \"kube-controller-manager-operator-78b949d7b-q4h66\" (UID: \"fb8ab568-8d37-4899-9337-600b4f41dfe5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.853164 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.853703 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f98952eb-6e5b-4114-abf8-f440a0c97b76-signing-key\") pod \"service-ca-9c57cc56f-4sskq\" (UID: 
\"f98952eb-6e5b-4114-abf8-f440a0c97b76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.853855 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb8ab568-8d37-4899-9337-600b4f41dfe5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-q4h66\" (UID: \"fb8ab568-8d37-4899-9337-600b4f41dfe5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.860326 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-tls\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.867899 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-metrics-tls\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.881042 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-bound-sa-token\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.883049 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8pdbx\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.904873 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.908522 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.915318 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.924131 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hml7\" (UniqueName: \"kubernetes.io/projected/f98952eb-6e5b-4114-abf8-f440a0c97b76-kube-api-access-8hml7\") pod \"service-ca-9c57cc56f-4sskq\" (UID: \"f98952eb-6e5b-4114-abf8-f440a0c97b76\") " pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:25 crc kubenswrapper[4965]: E0318 11:58:25.946346 4965 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.446297432 +0000 UTC m=+111.432484911 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.945806 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.946893 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c66fcc55-a5d5-4886-afa8-33dd5be2f631-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cv5z6\" (UID: \"c66fcc55-a5d5-4886-afa8-33dd5be2f631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.946930 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8-config-volume\") pod \"dns-default-w9kwc\" (UID: \"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8\") " pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:25 crc 
kubenswrapper[4965]: I0318 11:58:25.946946 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab0e8f3c-d83a-4b3a-aee7-f530f82761f8-cert\") pod \"ingress-canary-4xx64\" (UID: \"ab0e8f3c-d83a-4b3a-aee7-f530f82761f8\") " pod="openshift-ingress-canary/ingress-canary-4xx64" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.946971 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01c95b1b-48b1-47a7-9da0-7e6c8ced735a-node-bootstrap-token\") pod \"machine-config-server-cmx7r\" (UID: \"01c95b1b-48b1-47a7-9da0-7e6c8ced735a\") " pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.946993 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flzcg\" (UniqueName: \"kubernetes.io/projected/6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9-kube-api-access-flzcg\") pod \"machine-config-controller-84d6567774-kwbct\" (UID: \"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947011 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-plugins-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947053 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-socket-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 
11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947079 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4bbq\" (UniqueName: \"kubernetes.io/projected/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-kube-api-access-l4bbq\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947096 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klczw\" (UniqueName: \"kubernetes.io/projected/e712d1d3-e146-41c2-a5ef-610e4bb8771c-kube-api-access-klczw\") pod \"migrator-59844c95c7-2dm7f\" (UID: \"e712d1d3-e146-41c2-a5ef-610e4bb8771c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947112 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ccc8d88d-863a-43b2-916d-b9df5a38453d-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947129 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-registration-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947154 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vfp9m\" (UID: 
\"fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947173 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c66fcc55-a5d5-4886-afa8-33dd5be2f631-srv-cert\") pod \"olm-operator-6b444d44fb-cv5z6\" (UID: \"c66fcc55-a5d5-4886-afa8-33dd5be2f631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947196 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9-proxy-tls\") pod \"machine-config-controller-84d6567774-kwbct\" (UID: \"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947214 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kwbct\" (UID: \"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947231 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqs42\" (UniqueName: \"kubernetes.io/projected/7f9a78ea-4484-46dd-ab80-ac76fc089420-kube-api-access-mqs42\") pod \"service-ca-operator-777779d784-jvz5t\" (UID: \"7f9a78ea-4484-46dd-ab80-ac76fc089420\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947250 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cm28w\" (UniqueName: \"kubernetes.io/projected/ccc8d88d-863a-43b2-916d-b9df5a38453d-kube-api-access-cm28w\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947270 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n57b4\" (UniqueName: \"kubernetes.io/projected/01c95b1b-48b1-47a7-9da0-7e6c8ced735a-kube-api-access-n57b4\") pod \"machine-config-server-cmx7r\" (UID: \"01c95b1b-48b1-47a7-9da0-7e6c8ced735a\") " pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947286 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f9a78ea-4484-46dd-ab80-ac76fc089420-config\") pod \"service-ca-operator-777779d784-jvz5t\" (UID: \"7f9a78ea-4484-46dd-ab80-ac76fc089420\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947304 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jcl7\" (UniqueName: \"kubernetes.io/projected/c66fcc55-a5d5-4886-afa8-33dd5be2f631-kube-api-access-8jcl7\") pod \"olm-operator-6b444d44fb-cv5z6\" (UID: \"c66fcc55-a5d5-4886-afa8-33dd5be2f631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947331 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wfdq\" (UniqueName: \"kubernetes.io/projected/ab0e8f3c-d83a-4b3a-aee7-f530f82761f8-kube-api-access-9wfdq\") pod \"ingress-canary-4xx64\" (UID: \"ab0e8f3c-d83a-4b3a-aee7-f530f82761f8\") " pod="openshift-ingress-canary/ingress-canary-4xx64" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947351 
4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/19d4f554-489c-46b5-aba4-919ab05d8853-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bksfk\" (UID: \"19d4f554-489c-46b5-aba4-919ab05d8853\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947366 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdcrt\" (UniqueName: \"kubernetes.io/projected/19d4f554-489c-46b5-aba4-919ab05d8853-kube-api-access-vdcrt\") pod \"package-server-manager-789f6589d5-bksfk\" (UID: \"19d4f554-489c-46b5-aba4-919ab05d8853\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947392 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22snl\" (UniqueName: \"kubernetes.io/projected/ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8-kube-api-access-22snl\") pod \"dns-default-w9kwc\" (UID: \"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8\") " pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947410 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccc8d88d-863a-43b2-916d-b9df5a38453d-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947429 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcw4z\" (UniqueName: \"kubernetes.io/projected/98ebf2d2-03be-474a-a318-0e91e9164758-kube-api-access-dcw4z\") pod \"catalog-operator-68c6474976-nb2r4\" (UID: 
\"98ebf2d2-03be-474a-a318-0e91e9164758\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947447 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9a78ea-4484-46dd-ab80-ac76fc089420-serving-cert\") pod \"service-ca-operator-777779d784-jvz5t\" (UID: \"7f9a78ea-4484-46dd-ab80-ac76fc089420\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947463 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/98ebf2d2-03be-474a-a318-0e91e9164758-profile-collector-cert\") pod \"catalog-operator-68c6474976-nb2r4\" (UID: \"98ebf2d2-03be-474a-a318-0e91e9164758\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947486 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947643 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-csi-data-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947676 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8-metrics-tls\") pod \"dns-default-w9kwc\" (UID: \"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8\") " pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947697 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-mountpoint-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947716 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmmxw\" (UniqueName: \"kubernetes.io/projected/fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3-kube-api-access-vmmxw\") pod \"control-plane-machine-set-operator-78cbb6b69f-vfp9m\" (UID: \"fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947741 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ccc8d88d-863a-43b2-916d-b9df5a38453d-ready\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947756 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01c95b1b-48b1-47a7-9da0-7e6c8ced735a-certs\") pod \"machine-config-server-cmx7r\" (UID: \"01c95b1b-48b1-47a7-9da0-7e6c8ced735a\") " pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.947771 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/98ebf2d2-03be-474a-a318-0e91e9164758-srv-cert\") pod \"catalog-operator-68c6474976-nb2r4\" (UID: \"98ebf2d2-03be-474a-a318-0e91e9164758\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.948141 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-socket-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.948334 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccc8d88d-863a-43b2-916d-b9df5a38453d-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.949040 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kwbct\" (UID: \"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.949803 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f9a78ea-4484-46dd-ab80-ac76fc089420-config\") pod \"service-ca-operator-777779d784-jvz5t\" (UID: \"7f9a78ea-4484-46dd-ab80-ac76fc089420\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.951158 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-plugins-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.951254 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-registration-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.951272 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8-config-volume\") pod \"dns-default-w9kwc\" (UID: \"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8\") " pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.951394 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ccc8d88d-863a-43b2-916d-b9df5a38453d-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.951834 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ccc8d88d-863a-43b2-916d-b9df5a38453d-ready\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:25 crc kubenswrapper[4965]: E0318 11:58:25.951836 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.451814294 +0000 UTC m=+111.438001773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.952126 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-mountpoint-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.952334 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-csi-data-dir\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.952969 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb8ab568-8d37-4899-9337-600b4f41dfe5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-q4h66\" (UID: \"fb8ab568-8d37-4899-9337-600b4f41dfe5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.956138 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/98ebf2d2-03be-474a-a318-0e91e9164758-srv-cert\") pod \"catalog-operator-68c6474976-nb2r4\" (UID: \"98ebf2d2-03be-474a-a318-0e91e9164758\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.956287 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9-proxy-tls\") pod \"machine-config-controller-84d6567774-kwbct\" (UID: \"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.956375 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c66fcc55-a5d5-4886-afa8-33dd5be2f631-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cv5z6\" (UID: \"c66fcc55-a5d5-4886-afa8-33dd5be2f631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.956541 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vfp9m\" (UID: \"fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.957234 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/19d4f554-489c-46b5-aba4-919ab05d8853-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bksfk\" (UID: \"19d4f554-489c-46b5-aba4-919ab05d8853\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.958534 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/98ebf2d2-03be-474a-a318-0e91e9164758-profile-collector-cert\") pod \"catalog-operator-68c6474976-nb2r4\" (UID: \"98ebf2d2-03be-474a-a318-0e91e9164758\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.958764 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/01c95b1b-48b1-47a7-9da0-7e6c8ced735a-certs\") pod \"machine-config-server-cmx7r\" (UID: \"01c95b1b-48b1-47a7-9da0-7e6c8ced735a\") " pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.959397 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8-metrics-tls\") pod \"dns-default-w9kwc\" (UID: \"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8\") " pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.959612 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9a78ea-4484-46dd-ab80-ac76fc089420-serving-cert\") pod \"service-ca-operator-777779d784-jvz5t\" (UID: \"7f9a78ea-4484-46dd-ab80-ac76fc089420\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.959614 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c66fcc55-a5d5-4886-afa8-33dd5be2f631-srv-cert\") pod \"olm-operator-6b444d44fb-cv5z6\" (UID: \"c66fcc55-a5d5-4886-afa8-33dd5be2f631\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.963212 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/01c95b1b-48b1-47a7-9da0-7e6c8ced735a-node-bootstrap-token\") pod \"machine-config-server-cmx7r\" (UID: \"01c95b1b-48b1-47a7-9da0-7e6c8ced735a\") " pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.963685 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab0e8f3c-d83a-4b3a-aee7-f530f82761f8-cert\") pod \"ingress-canary-4xx64\" (UID: \"ab0e8f3c-d83a-4b3a-aee7-f530f82761f8\") " pod="openshift-ingress-canary/ingress-canary-4xx64" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.965462 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbg56\" (UniqueName: \"kubernetes.io/projected/8f3fbc5c-1a5c-408c-acc0-648113e6e9bc-kube-api-access-vbg56\") pod \"ingress-operator-5b745b69d9-r8xpr\" (UID: \"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.965788 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.983013 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgcc8\" (UniqueName: \"kubernetes.io/projected/10476ddd-0e25-4297-8e49-ae3cb9d2d0c4-kube-api-access-wgcc8\") pod \"cluster-image-registry-operator-dc59b4c8b-wg6wd\" (UID: \"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.986060 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" Mar 18 11:58:25 crc kubenswrapper[4965]: I0318 11:58:25.996720 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.010218 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64htn\" (UniqueName: \"kubernetes.io/projected/18b96e51-1a8b-4947-a9a3-9af0c0e88100-kube-api-access-64htn\") pod \"downloads-7954f5f757-q9zxt\" (UID: \"18b96e51-1a8b-4947-a9a3-9af0c0e88100\") " pod="openshift-console/downloads-7954f5f757-q9zxt" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.019252 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.032896 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfnlh\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-kube-api-access-vfnlh\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.049077 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwm5l\" (UniqueName: \"kubernetes.io/projected/beb408f9-638c-4bd9-b4ec-c72bc43286e3-kube-api-access-dwm5l\") pod \"marketplace-operator-79b997595-8pdbx\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.049729 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.050259 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.550242056 +0000 UTC m=+111.536429535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.092681 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqs42\" (UniqueName: \"kubernetes.io/projected/7f9a78ea-4484-46dd-ab80-ac76fc089420-kube-api-access-mqs42\") pod \"service-ca-operator-777779d784-jvz5t\" (UID: \"7f9a78ea-4484-46dd-ab80-ac76fc089420\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.100817 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-q9zxt" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.104888 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.110830 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm28w\" (UniqueName: \"kubernetes.io/projected/ccc8d88d-863a-43b2-916d-b9df5a38453d-kube-api-access-cm28w\") pod \"cni-sysctl-allowlist-ds-5scj2\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.126619 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n57b4\" (UniqueName: \"kubernetes.io/projected/01c95b1b-48b1-47a7-9da0-7e6c8ced735a-kube-api-access-n57b4\") pod \"machine-config-server-cmx7r\" (UID: \"01c95b1b-48b1-47a7-9da0-7e6c8ced735a\") " pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.149785 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.152779 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jcl7\" (UniqueName: \"kubernetes.io/projected/c66fcc55-a5d5-4886-afa8-33dd5be2f631-kube-api-access-8jcl7\") pod \"olm-operator-6b444d44fb-cv5z6\" (UID: \"c66fcc55-a5d5-4886-afa8-33dd5be2f631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.153996 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.157916 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6w7zg"] Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.159138 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.659119286 +0000 UTC m=+111.645306765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.166605 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wfdq\" (UniqueName: \"kubernetes.io/projected/ab0e8f3c-d83a-4b3a-aee7-f530f82761f8-kube-api-access-9wfdq\") pod \"ingress-canary-4xx64\" (UID: \"ab0e8f3c-d83a-4b3a-aee7-f530f82761f8\") " pod="openshift-ingress-canary/ingress-canary-4xx64" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.173418 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cmx7r" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.180715 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4xx64" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.196321 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdcrt\" (UniqueName: \"kubernetes.io/projected/19d4f554-489c-46b5-aba4-919ab05d8853-kube-api-access-vdcrt\") pod \"package-server-manager-789f6589d5-bksfk\" (UID: \"19d4f554-489c-46b5-aba4-919ab05d8853\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.199236 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.200974 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.204951 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.229405 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flzcg\" (UniqueName: \"kubernetes.io/projected/6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9-kube-api-access-flzcg\") pod \"machine-config-controller-84d6567774-kwbct\" (UID: \"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.230407 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bbq\" (UniqueName: \"kubernetes.io/projected/047424fe-da2d-4da3-a8b4-ff9bc5ae743f-kube-api-access-l4bbq\") pod \"csi-hostpathplugin-xjqrf\" (UID: \"047424fe-da2d-4da3-a8b4-ff9bc5ae743f\") " pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" 
Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.245084 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22snl\" (UniqueName: \"kubernetes.io/projected/ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8-kube-api-access-22snl\") pod \"dns-default-w9kwc\" (UID: \"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8\") " pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.256565 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.257051 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.757027784 +0000 UTC m=+111.743215263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.262895 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klczw\" (UniqueName: \"kubernetes.io/projected/e712d1d3-e146-41c2-a5ef-610e4bb8771c-kube-api-access-klczw\") pod \"migrator-59844c95c7-2dm7f\" (UID: \"e712d1d3-e146-41c2-a5ef-610e4bb8771c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.324983 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcw4z\" (UniqueName: \"kubernetes.io/projected/98ebf2d2-03be-474a-a318-0e91e9164758-kube-api-access-dcw4z\") pod \"catalog-operator-68c6474976-nb2r4\" (UID: \"98ebf2d2-03be-474a-a318-0e91e9164758\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.332485 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.347463 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmxw\" (UniqueName: \"kubernetes.io/projected/fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3-kube-api-access-vmmxw\") pod \"control-plane-machine-set-operator-78cbb6b69f-vfp9m\" (UID: \"fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.358634 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.359342 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.859327943 +0000 UTC m=+111.845515412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.365738 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.380603 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.382378 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.386988 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.390440 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.405755 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fgg7z"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.409157 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.418738 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.436709 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.444637 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.459489 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.465336 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.465717 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:26.965702174 +0000 UTC m=+111.951889653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: W0318 11:58:26.475792 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3e59a9f_c0ea_4a0e_a39a_f332e3eba1a4.slice/crio-4d015ec3c518495867752be11e9f7bc0871b6d7219d846015fc00c8c813d4826 WatchSource:0}: Error finding container 4d015ec3c518495867752be11e9f7bc0871b6d7219d846015fc00c8c813d4826: Status 404 returned error can't find the container with id 4d015ec3c518495867752be11e9f7bc0871b6d7219d846015fc00c8c813d4826 Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.502570 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" Mar 18 11:58:26 crc kubenswrapper[4965]: W0318 11:58:26.507406 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb6cf60_3071_4f09_b9aa_e5d05b03804e.slice/crio-0a221e30edebcbb07787da4ed4177814a94ad640089c53183ff11c58c1440902 WatchSource:0}: Error finding container 0a221e30edebcbb07787da4ed4177814a94ad640089c53183ff11c58c1440902: Status 404 returned error can't find the container with id 0a221e30edebcbb07787da4ed4177814a94ad640089c53183ff11c58c1440902 Mar 18 11:58:26 crc kubenswrapper[4965]: W0318 11:58:26.544226 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a9e071f_3977_4a85_9b66_30c59efd7d3a.slice/crio-a45303dc5c656cdbdb9d602cc515632a6fd3b684ec769c48ff40319c762e43e3 WatchSource:0}: Error finding container a45303dc5c656cdbdb9d602cc515632a6fd3b684ec769c48ff40319c762e43e3: Status 404 returned error can't find the container with id a45303dc5c656cdbdb9d602cc515632a6fd3b684ec769c48ff40319c762e43e3 Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.555575 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.566596 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.567115 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.067100678 +0000 UTC m=+112.053288157 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.589649 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt" event={"ID":"f7011ee6-49c9-4b89-ae23-545bac0f68d0","Type":"ContainerStarted","Data":"44d900e6169c8fa607e5ac638277268bff6bb1505373d10259d70f1908777704"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.595138 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sw5p7"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.595249 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z" event={"ID":"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4","Type":"ContainerStarted","Data":"4d015ec3c518495867752be11e9f7bc0871b6d7219d846015fc00c8c813d4826"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.604630 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sk66r"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.610250 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" 
event={"ID":"b0c9543d-7fc2-45cb-a925-f3c030a3c8cf","Type":"ContainerStarted","Data":"470cdf496a332f0b5df657d0e7994c978c58acb74e63319ffa2c001a71d7f0bf"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.610300 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" event={"ID":"b0c9543d-7fc2-45cb-a925-f3c030a3c8cf","Type":"ContainerStarted","Data":"4c3ce47ae65593763827955f132164b026fe6d9612f386578bd74c0e3b8816b8"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.613172 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" event={"ID":"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47","Type":"ContainerStarted","Data":"c7afae4cfd1af1b7b9cdd5e9c1021dee7c87d7c24757fa80246ce8a27c0ecdc0"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.615916 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" event={"ID":"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa","Type":"ContainerStarted","Data":"b0f45b64f60598cd2d6934e8556d33fb84fb98fb9bba88793f8becd91323dd70"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.617281 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7" event={"ID":"5b1dc0d2-30e8-46f0-a933-cd321c32590b","Type":"ContainerStarted","Data":"df056ec2e4f8c81dcbe0d338b2ec80c8db10dc408aea2f9cdab53734134c1785"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.642607 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" event={"ID":"7a9e071f-3977-4a85-9b66-30c59efd7d3a","Type":"ContainerStarted","Data":"a45303dc5c656cdbdb9d602cc515632a6fd3b684ec769c48ff40319c762e43e3"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.656261 4965 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ingress/router-default-5444994796-xhsv5" event={"ID":"ea2b4f76-5f61-410a-bb32-c1185e630c7a","Type":"ContainerStarted","Data":"abb477666e5d189697c4ace461c1f2d9d15a238f143aac5fc936c014d83f4eb5"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.659251 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xhsv5" event={"ID":"ea2b4f76-5f61-410a-bb32-c1185e630c7a","Type":"ContainerStarted","Data":"06a133ed4fd61cf5f94bad79b051c56b9813fa5a3390ce34a01452faa3a3fa33"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.659614 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cmx7r" event={"ID":"01c95b1b-48b1-47a7-9da0-7e6c8ced735a","Type":"ContainerStarted","Data":"7c59e1591ea39483aa1f4380a4eb43d3656f793d6fa1474a6f63e3a417a8bc75"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.662919 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" event={"ID":"2aaf7d7a-c623-4198-b68e-7efa895cb96f","Type":"ContainerStarted","Data":"efc9f89a90861ced30a0f130b5fd9c8b0d25603a681a3a7a811e0b54f1245446"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.667094 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" event={"ID":"9bb6cf60-3071-4f09-b9aa-e5d05b03804e","Type":"ContainerStarted","Data":"0a221e30edebcbb07787da4ed4177814a94ad640089c53183ff11c58c1440902"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.667459 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:26 crc 
kubenswrapper[4965]: E0318 11:58:26.669050 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.169033196 +0000 UTC m=+112.155220675 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.669853 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.670557 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.170548688 +0000 UTC m=+112.156736167 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.723579 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.725289 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.725335 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.730635 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" event={"ID":"36b1bb7c-dba1-491c-84ba-820adbe96f3a","Type":"ContainerStarted","Data":"5d28f9739e171eb1422de1dbfbd0544cdcbdaf9f146d4f53ea5381a6eb6a4e5d"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.749153 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" 
event={"ID":"9b045370-3446-4fe4-b7c4-d0b12ddd06f4","Type":"ContainerStarted","Data":"599f77b90586c5e1e6fd87151157f069ff1557005d5d175662c43f09e8021062"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.768371 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" event={"ID":"081ced5c-b889-43c7-a7a5-bb6561452f4a","Type":"ContainerStarted","Data":"54cf423c84461c583b56518e193e2914b95bfd27b35bb4b43c1c016e2a5339e9"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.771717 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.774178 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.274154373 +0000 UTC m=+112.260341872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.786706 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" event={"ID":"4a9a21f3-0936-44a9-b1f0-e82345703f0a","Type":"ContainerStarted","Data":"a7e9a36bf74e0a89fc6fcbde3f74d3ceb57bd31d76d7755517cc4e703eb286bd"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.786749 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" event={"ID":"4a9a21f3-0936-44a9-b1f0-e82345703f0a","Type":"ContainerStarted","Data":"b8391859c9bdcacaa7aa45e9fe88f503f54c82e9cc659fcb98f7e066208ee525"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.797452 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" event={"ID":"3d252be2-ace2-40c4-a4e3-3824d9bffff4","Type":"ContainerStarted","Data":"2cb35ecd19e5ef8198d4168a0f57e0193322e9c7bb7fad5fad3131fa9050e391"} Mar 18 11:58:26 crc kubenswrapper[4965]: W0318 11:58:26.817304 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d01f52f_de93_4fa8_80c2_88b21f4e6400.slice/crio-57a3cacd616d800fd70f928b20bd045f63a616336e7d123c73ce5a76543c0387 WatchSource:0}: Error finding container 57a3cacd616d800fd70f928b20bd045f63a616336e7d123c73ce5a76543c0387: Status 404 returned error can't find the 
container with id 57a3cacd616d800fd70f928b20bd045f63a616336e7d123c73ce5a76543c0387 Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.826546 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" event={"ID":"ccc8d88d-863a-43b2-916d-b9df5a38453d","Type":"ContainerStarted","Data":"b4c6d0c88c9db38dc49b013fde4d006b99a97e8b220d1823a382856641390973"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.835528 4965 generic.go:334] "Generic (PLEG): container finished" podID="8b27563b-d715-4dfa-84b6-5d0f1d90e4b5" containerID="f47b336a2ea27f7fc44e8a084f02ac7efd45bff07e01cc44aae76c059cde7bc5" exitCode=0 Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.835598 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" event={"ID":"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5","Type":"ContainerDied","Data":"f47b336a2ea27f7fc44e8a084f02ac7efd45bff07e01cc44aae76c059cde7bc5"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.835625 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" event={"ID":"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5","Type":"ContainerStarted","Data":"c5b04efc5e895328197913f1a0a715db8242d9a2aa375ca7e8e9de64977ad74b"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.839231 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" event={"ID":"9e7d665e-a211-432c-8884-74096504ab5c","Type":"ContainerStarted","Data":"e1d482cbecea0a2e08c5fcb44998491c3ed53d1d3d221f4d0c1fb2934de255bc"} Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.878893 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.882383 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.382363853 +0000 UTC m=+112.368551432 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.892345 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mmzsn"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.919290 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-q9zxt"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.926715 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.946024 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.972492 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd"] Mar 18 11:58:26 crc kubenswrapper[4965]: I0318 11:58:26.979672 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:26 crc kubenswrapper[4965]: E0318 11:58:26.980087 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.480067475 +0000 UTC m=+112.466254954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: W0318 11:58:27.000548 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29015bb8_e604_425d_a88b_db3ec9e10096.slice/crio-2fe8dcd9b2d990422de9e6ea4ee1c7cede7f481e3c0b96f2cf74bf78be0b088a WatchSource:0}: Error finding container 2fe8dcd9b2d990422de9e6ea4ee1c7cede7f481e3c0b96f2cf74bf78be0b088a: Status 404 returned error can't find the container with id 2fe8dcd9b2d990422de9e6ea4ee1c7cede7f481e3c0b96f2cf74bf78be0b088a Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.090125 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.090770 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.590729825 +0000 UTC m=+112.576917304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.130524 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qtnkp" Mar 18 11:58:27 crc kubenswrapper[4965]: W0318 11:58:27.150776 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10476ddd_0e25_4297_8e49_ae3cb9d2d0c4.slice/crio-38ba8711c011b4b95953cf268a9b31feb2e9d36c22253a2b832cf99d649e585c WatchSource:0}: Error finding container 38ba8711c011b4b95953cf268a9b31feb2e9d36c22253a2b832cf99d649e585c: Status 404 returned error can't find the container with id 38ba8711c011b4b95953cf268a9b31feb2e9d36c22253a2b832cf99d649e585c Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.174337 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.191292 4965 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.192651 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.692630712 +0000 UTC m=+112.678818191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.192780 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.193140 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.693131986 +0000 UTC m=+112.679319465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.209110 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4sskq"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.231152 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.256150 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66"] Mar 18 11:58:27 crc kubenswrapper[4965]: W0318 11:58:27.292508 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf98952eb_6e5b_4114_abf8_f440a0c97b76.slice/crio-b1c2570481a8fedbd5a8328011ca702f3a409701836f3bc60a1af5254a731a46 WatchSource:0}: Error finding container b1c2570481a8fedbd5a8328011ca702f3a409701836f3bc60a1af5254a731a46: Status 404 returned error can't find the container with id b1c2570481a8fedbd5a8328011ca702f3a409701836f3bc60a1af5254a731a46 Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.304732 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:27 crc 
kubenswrapper[4965]: E0318 11:58:27.305033 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.805001159 +0000 UTC m=+112.791188638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.305189 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.305783 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.805752679 +0000 UTC m=+112.791940158 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.347493 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8pdbx"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.374899 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4xx64"] Mar 18 11:58:27 crc kubenswrapper[4965]: W0318 11:58:27.405184 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeb408f9_638c_4bd9_b4ec_c72bc43286e3.slice/crio-e59d8ce29c399f6c8778404556cd9f40a79f29b6aac26619b624302a8b88dc72 WatchSource:0}: Error finding container e59d8ce29c399f6c8778404556cd9f40a79f29b6aac26619b624302a8b88dc72: Status 404 returned error can't find the container with id e59d8ce29c399f6c8778404556cd9f40a79f29b6aac26619b624302a8b88dc72 Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.406243 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.406586 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:27.906570187 +0000 UTC m=+112.892757666 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.430851 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.477778 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.486854 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-pj8km" podStartSLOduration=51.486836599 podStartE2EDuration="51.486836599s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:27.484138404 +0000 UTC m=+112.470325893" watchObservedRunningTime="2026-03-18 11:58:27.486836599 +0000 UTC m=+112.473024078" Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.508633 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.508968 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.509578 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.009565675 +0000 UTC m=+112.995753154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.533289 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xjqrf"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.537020 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b676c03-201d-403c-8082-84451760c106-metrics-certs\") pod \"network-metrics-daemon-9jx9z\" (UID: \"1b676c03-201d-403c-8082-84451760c106\") " pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.551250 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.565512 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.573387 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w9kwc"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.577753 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct"] Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.611494 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.613923 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.11390646 +0000 UTC m=+113.100093939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.646083 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9jx9z" Mar 18 11:58:27 crc kubenswrapper[4965]: W0318 11:58:27.671232 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc66fcc55_a5d5_4886_afa8_33dd5be2f631.slice/crio-7553cac2b5e174187444195e9865f98b235ea3be0702c94f521a16d113ca1e69 WatchSource:0}: Error finding container 7553cac2b5e174187444195e9865f98b235ea3be0702c94f521a16d113ca1e69: Status 404 returned error can't find the container with id 7553cac2b5e174187444195e9865f98b235ea3be0702c94f521a16d113ca1e69 Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.712727 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.713040 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.213029351 +0000 UTC m=+113.199216820 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: W0318 11:58:27.718255 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea7fccc2_5160_43d1_bbfa_7c6905d4a0d8.slice/crio-b69dc16c38f0a07f1977a6a3ed16ef2bcfbf4c4709c3a6b1203b560d70c37476 WatchSource:0}: Error finding container b69dc16c38f0a07f1977a6a3ed16ef2bcfbf4c4709c3a6b1203b560d70c37476: Status 404 returned error can't find the container with id b69dc16c38f0a07f1977a6a3ed16ef2bcfbf4c4709c3a6b1203b560d70c37476 Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.740885 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:27 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:27 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:27 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.740945 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.813951 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.814321 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.314294841 +0000 UTC m=+113.300482360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.916938 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:27 crc kubenswrapper[4965]: E0318 11:58:27.917476 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.417453354 +0000 UTC m=+113.403640833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.923972 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z" event={"ID":"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4","Type":"ContainerStarted","Data":"d1f973892aabb8d44815e5287084da8f89e2359a35c3692ec590f18b44eff5a8"} Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.927941 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" event={"ID":"c66fcc55-a5d5-4886-afa8-33dd5be2f631","Type":"ContainerStarted","Data":"7553cac2b5e174187444195e9865f98b235ea3be0702c94f521a16d113ca1e69"} Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.931935 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-k68bp" podStartSLOduration=51.931917722 podStartE2EDuration="51.931917722s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:27.923647654 +0000 UTC m=+112.909835133" watchObservedRunningTime="2026-03-18 11:58:27.931917722 +0000 UTC m=+112.918105201" Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.957331 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" 
event={"ID":"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9","Type":"ContainerStarted","Data":"52c52eda1b89cd34717c437bb8ca2df5ce3c1d6dbfd6b4d8d96f633e7805b1df"} Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.975726 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nqqmj" podStartSLOduration=51.975706309 podStartE2EDuration="51.975706309s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:27.954775602 +0000 UTC m=+112.940963101" watchObservedRunningTime="2026-03-18 11:58:27.975706309 +0000 UTC m=+112.961893788" Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.976208 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ntz44" podStartSLOduration=51.976202982 podStartE2EDuration="51.976202982s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:27.974451254 +0000 UTC m=+112.960638743" watchObservedRunningTime="2026-03-18 11:58:27.976202982 +0000 UTC m=+112.962390461" Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.978822 4965 generic.go:334] "Generic (PLEG): container finished" podID="7a9e071f-3977-4a85-9b66-30c59efd7d3a" containerID="e0fc6e860691a9d12855adac036babd2b09549b376c396ef20725f4f92e3923b" exitCode=0 Mar 18 11:58:27 crc kubenswrapper[4965]: I0318 11:58:27.978933 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" event={"ID":"7a9e071f-3977-4a85-9b66-30c59efd7d3a","Type":"ContainerDied","Data":"e0fc6e860691a9d12855adac036babd2b09549b376c396ef20725f4f92e3923b"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 
11:58:28.009546 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-q9zxt" event={"ID":"18b96e51-1a8b-4947-a9a3-9af0c0e88100","Type":"ContainerStarted","Data":"b3917ecaacbbfcdc5acf018f1eb62eeed4ecd1afb2dc476b8a2d21e73253468a"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.020275 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.020723 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.520700738 +0000 UTC m=+113.506888217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.038635 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" event={"ID":"047424fe-da2d-4da3-a8b4-ff9bc5ae743f","Type":"ContainerStarted","Data":"f2a1d7f089b64a4eb33e3cc1b3151157c5b102b2992c0005f61d1e7d7f46272c"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.046149 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" event={"ID":"2aaf7d7a-c623-4198-b68e-7efa895cb96f","Type":"ContainerStarted","Data":"ca308b01598a8ea855134f9818c3d3e759c37ff50919fd540d1d6e550e2f213c"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.047548 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xhsv5" podStartSLOduration=52.047529008 podStartE2EDuration="52.047529008s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:28.04653019 +0000 UTC m=+113.032717679" watchObservedRunningTime="2026-03-18 11:58:28.047529008 +0000 UTC m=+113.033716487" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.048844 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt" 
event={"ID":"f7011ee6-49c9-4b89-ae23-545bac0f68d0","Type":"ContainerStarted","Data":"02d5cf686ddd49d95d2e02cb545d8291e530f1dde1edd4953cab73db613b590a"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.051188 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" event={"ID":"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7","Type":"ContainerStarted","Data":"13d30575086fb5f3f678c045140f124f319d05bc36b9983f41d0722132c2755e"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.052334 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" event={"ID":"19d4f554-489c-46b5-aba4-919ab05d8853","Type":"ContainerStarted","Data":"a00c75fb5f945313926aeea932850e3bff0e3e1dc1b6da5aac892ac417225adf"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.054453 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" event={"ID":"9e7d665e-a211-432c-8884-74096504ab5c","Type":"ContainerStarted","Data":"e9e3ddf4167ecf1950b093dbf2b790e774b3cd1f1acad371cd46f59d5f022a42"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.054487 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" event={"ID":"9e7d665e-a211-432c-8884-74096504ab5c","Type":"ContainerStarted","Data":"878b2ae9d4d6386f9876e2e262ef5d081e3a6d8a74d80bd5a42b167b320b83f9"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.058493 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" event={"ID":"beb408f9-638c-4bd9-b4ec-c72bc43286e3","Type":"ContainerStarted","Data":"e59d8ce29c399f6c8778404556cd9f40a79f29b6aac26619b624302a8b88dc72"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.083831 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" event={"ID":"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4","Type":"ContainerStarted","Data":"38ba8711c011b4b95953cf268a9b31feb2e9d36c22253a2b832cf99d649e585c"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.102227 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" event={"ID":"8aa829e4-03d9-4359-9d6e-a7ce76a2072b","Type":"ContainerStarted","Data":"799bd116e83b286f94d1f2d2a93bbe96850dab88e087c903c4de9d37496f53f7"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.106558 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7" event={"ID":"5b1dc0d2-30e8-46f0-a933-cd321c32590b","Type":"ContainerStarted","Data":"fa325dc136e7377d3f42760f4c479f75c83588d8c67b5f51456cd89de2a555ac"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.121764 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.123014 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.623002627 +0000 UTC m=+113.609190106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.125934 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" event={"ID":"b0c9543d-7fc2-45cb-a925-f3c030a3c8cf","Type":"ContainerStarted","Data":"6348f0eda1c1d62817e3b759a0ad3f7ddd232a7f56476fc1939f9c09df1acf73"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.128943 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" event={"ID":"9b045370-3446-4fe4-b7c4-d0b12ddd06f4","Type":"ContainerStarted","Data":"39117208c76779ffcc18d9075116ed4c7fc04dab720ebfe145d4ee7caa5d5c3b"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.133506 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" podStartSLOduration=52.133485476 podStartE2EDuration="52.133485476s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:28.090124561 +0000 UTC m=+113.076312040" watchObservedRunningTime="2026-03-18 11:58:28.133485476 +0000 UTC m=+113.119672955" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.152588 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" 
event={"ID":"7f9a78ea-4484-46dd-ab80-ac76fc089420","Type":"ContainerStarted","Data":"5b44dca8847c4bcaba1a92d43f4b4b8712728f2ef1fc04dd904fe768de2f1018"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.155838 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" event={"ID":"6cfd1d38-a35c-47ab-962f-4403875e5a19","Type":"ContainerStarted","Data":"fbdc825a902b577dde5200419eb296775ac76507f1d9783682163b0a3166aebe"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.159319 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" event={"ID":"9bb6cf60-3071-4f09-b9aa-e5d05b03804e","Type":"ContainerStarted","Data":"4d684ab2b0dba807dc9dd2e958ee9b1a1d7bf42d014324a6d0aa37f57df19a31"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.190645 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" event={"ID":"fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3","Type":"ContainerStarted","Data":"5225bfea2bd9f2dde43041830339b9e92f44532da2f1669f45b560deb0bfb9bc"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.199323 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s2wpl" podStartSLOduration=52.1993071 podStartE2EDuration="52.1993071s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:28.198844407 +0000 UTC m=+113.185031886" watchObservedRunningTime="2026-03-18 11:58:28.1993071 +0000 UTC m=+113.185494579" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.216073 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" event={"ID":"081ced5c-b889-43c7-a7a5-bb6561452f4a","Type":"ContainerStarted","Data":"6e9f66265fed122caae11f02c02acdf0efea97505345a1b980693014002a2591"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.219490 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" event={"ID":"fb8ab568-8d37-4899-9337-600b4f41dfe5","Type":"ContainerStarted","Data":"db42ac971ca31fee5dce770949c99896d6b18fb6657c364c44191320595074ba"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.220448 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" event={"ID":"8d01f52f-de93-4fa8-80c2-88b21f4e6400","Type":"ContainerStarted","Data":"57a3cacd616d800fd70f928b20bd045f63a616336e7d123c73ce5a76543c0387"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.220962 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.222181 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.224609 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.724593906 +0000 UTC m=+113.710781385 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.232406 4965 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dddvm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.232454 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" podUID="8d01f52f-de93-4fa8-80c2-88b21f4e6400" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.249038 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" event={"ID":"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47","Type":"ContainerStarted","Data":"b2211f45bc30f734f1457aaa7b9b2d1f44606a90dae8d8c7f512af14d38fa170"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.260974 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sflt7" podStartSLOduration=52.260954718 podStartE2EDuration="52.260954718s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-18 11:58:28.258961873 +0000 UTC m=+113.245149352" watchObservedRunningTime="2026-03-18 11:58:28.260954718 +0000 UTC m=+113.247142197" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.277799 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" event={"ID":"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc","Type":"ContainerStarted","Data":"3b1b4ede37c52a7f5ecca4e9bd0f0860a664919f9291483c9f20a42450de34f6"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.278557 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" event={"ID":"98ebf2d2-03be-474a-a318-0e91e9164758","Type":"ContainerStarted","Data":"08d02bceaa60d0e3f277719b1d7b84f2e23f859ad602a8fe717c866ae2d4340f"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.279489 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cmx7r" event={"ID":"01c95b1b-48b1-47a7-9da0-7e6c8ced735a","Type":"ContainerStarted","Data":"1ced77b27acf14a6e2209588b3e8b74645da50dec980e81ca264e1e687e021b0"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.300519 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f" event={"ID":"e712d1d3-e146-41c2-a5ef-610e4bb8771c","Type":"ContainerStarted","Data":"702e6be0e6213ca3fce457a6870be5792ff77543c22d323b99219e636f186d56"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.319857 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4xx64" event={"ID":"ab0e8f3c-d83a-4b3a-aee7-f530f82761f8","Type":"ContainerStarted","Data":"e11ba2864208d8b1f2f908aa6dc311d8284c784a418912fc6ca5a852ab50daea"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.325533 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.327483 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.827471611 +0000 UTC m=+113.813659090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.356020 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" event={"ID":"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa","Type":"ContainerStarted","Data":"da48b5d5aae8b1859c57c67b3447739ccde310907be30993f1c39c566133fcbf"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.368887 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w9kwc" event={"ID":"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8","Type":"ContainerStarted","Data":"b69dc16c38f0a07f1977a6a3ed16ef2bcfbf4c4709c3a6b1203b560d70c37476"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.382822 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" 
event={"ID":"ccc8d88d-863a-43b2-916d-b9df5a38453d","Type":"ContainerStarted","Data":"3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.383331 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.388915 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" event={"ID":"f98952eb-6e5b-4114-abf8-f440a0c97b76","Type":"ContainerStarted","Data":"b1c2570481a8fedbd5a8328011ca702f3a409701836f3bc60a1af5254a731a46"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.395095 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" event={"ID":"29015bb8-e604-425d-a88b-db3ec9e10096","Type":"ContainerStarted","Data":"9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.395204 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.395214 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" event={"ID":"29015bb8-e604-425d-a88b-db3ec9e10096","Type":"ContainerStarted","Data":"2fe8dcd9b2d990422de9e6ea4ee1c7cede7f481e3c0b96f2cf74bf78be0b088a"} Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.409892 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wszrt" podStartSLOduration=52.409874752 podStartE2EDuration="52.409874752s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 11:58:28.40943711 +0000 UTC m=+113.395624589" watchObservedRunningTime="2026-03-18 11:58:28.409874752 +0000 UTC m=+113.396062231" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.423911 4965 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sw5p7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.423963 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" podUID="29015bb8-e604-425d-a88b-db3ec9e10096" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.430463 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.430798 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:28.930777228 +0000 UTC m=+113.916964707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.445601 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-6w7zg" podStartSLOduration=52.445576835 podStartE2EDuration="52.445576835s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:28.444979899 +0000 UTC m=+113.431167378" watchObservedRunningTime="2026-03-18 11:58:28.445576835 +0000 UTC m=+113.431764314" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.478948 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.533899 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.562896 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 11:58:29.062851387 +0000 UTC m=+114.049038866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.585331 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"] Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.585584 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sw5p7"] Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.634484 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9jx9z"] Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.635107 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.635564 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.13554559 +0000 UTC m=+114.121733069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.737034 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.737428 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:28 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:28 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:28 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.737459 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.737465 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 11:58:29.237443007 +0000 UTC m=+114.223630536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.802302 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-zptvt" podStartSLOduration=51.802283004 podStartE2EDuration="51.802283004s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:28.800857325 +0000 UTC m=+113.787044804" watchObservedRunningTime="2026-03-18 11:58:28.802283004 +0000 UTC m=+113.788470483" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.839523 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.840054 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.340035924 +0000 UTC m=+114.326223403 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.871120 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" podStartSLOduration=6.87109683 podStartE2EDuration="6.87109683s" podCreationTimestamp="2026-03-18 11:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:28.865953648 +0000 UTC m=+113.852141127" watchObservedRunningTime="2026-03-18 11:58:28.87109683 +0000 UTC m=+113.857284309" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.899235 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-llx6v" podStartSLOduration=51.899219865 podStartE2EDuration="51.899219865s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:28.896937312 +0000 UTC m=+113.883124791" watchObservedRunningTime="2026-03-18 11:58:28.899219865 +0000 UTC m=+113.885407344" Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.941215 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" 
(UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:28 crc kubenswrapper[4965]: E0318 11:58:28.942260 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.44224719 +0000 UTC m=+114.428434659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:28 crc kubenswrapper[4965]: I0318 11:58:28.965202 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" podStartSLOduration=52.965183682 podStartE2EDuration="52.965183682s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:28.959324501 +0000 UTC m=+113.945511990" watchObservedRunningTime="2026-03-18 11:58:28.965183682 +0000 UTC m=+113.951371161" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.034105 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" podStartSLOduration=52.034084531 podStartE2EDuration="52.034084531s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
11:58:29.032140787 +0000 UTC m=+114.018328266" watchObservedRunningTime="2026-03-18 11:58:29.034084531 +0000 UTC m=+114.020272030" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.034437 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" podStartSLOduration=52.03443069 podStartE2EDuration="52.03443069s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:28.985345208 +0000 UTC m=+113.971532687" watchObservedRunningTime="2026-03-18 11:58:29.03443069 +0000 UTC m=+114.020618169" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.043113 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.043300 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.543281604 +0000 UTC m=+114.529469073 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.043333 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.043900 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.543884301 +0000 UTC m=+114.530071780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.061604 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cmx7r" podStartSLOduration=7.061584479 podStartE2EDuration="7.061584479s" podCreationTimestamp="2026-03-18 11:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.05980912 +0000 UTC m=+114.045996599" watchObservedRunningTime="2026-03-18 11:58:29.061584479 +0000 UTC m=+114.047771958" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.103306 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wbdxj" podStartSLOduration=53.103289138 podStartE2EDuration="53.103289138s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.102205608 +0000 UTC m=+114.088393087" watchObservedRunningTime="2026-03-18 11:58:29.103289138 +0000 UTC m=+114.089476617" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.144899 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.145051 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.645025218 +0000 UTC m=+114.631212697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.146548 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.146991 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.646959601 +0000 UTC m=+114.633147090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.247645 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.248137 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.748116408 +0000 UTC m=+114.734303897 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.248408 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.248729 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.748717015 +0000 UTC m=+114.734904494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.349822 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.350280 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.850262173 +0000 UTC m=+114.836449652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.414176 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" event={"ID":"8aa829e4-03d9-4359-9d6e-a7ce76a2072b","Type":"ContainerStarted","Data":"4689c3919008d0d543f0fe1f20eefbade43853d2d91b298cf743a808bb1ae4a9"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.414577 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.417234 4965 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-sk66r container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" start-of-body= Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.417287 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" podUID="8aa829e4-03d9-4359-9d6e-a7ce76a2072b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.419231 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" 
event={"ID":"8d01f52f-de93-4fa8-80c2-88b21f4e6400","Type":"ContainerStarted","Data":"43cb96f12b24877b2f3f9246f32a4c523d4ae3fa37bba47b791407e7708ef275"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.420369 4965 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dddvm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.420406 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" podUID="8d01f52f-de93-4fa8-80c2-88b21f4e6400" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.422236 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w9kwc" event={"ID":"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8","Type":"ContainerStarted","Data":"93e40aa16b6e7217767c439e8b9531c28f4c9a90a916e1dae9bb493b8755773d"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.433307 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" event={"ID":"f98952eb-6e5b-4114-abf8-f440a0c97b76","Type":"ContainerStarted","Data":"323361c457a2f3c42c6d2aa3a447150c1cb341fc69ed2c9331ac81d1de0dbfc4"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.454274 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 
11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.454518 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" podStartSLOduration=53.454499065 podStartE2EDuration="53.454499065s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.453289531 +0000 UTC m=+114.439477020" watchObservedRunningTime="2026-03-18 11:58:29.454499065 +0000 UTC m=+114.440686544" Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.454790 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:29.954766982 +0000 UTC m=+114.940954461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.462993 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" event={"ID":"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9","Type":"ContainerStarted","Data":"053dac0583b803ed3b52f3491d4e8ab996bd6f94f3320f07edbe44abe25f5560"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.463043 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" 
event={"ID":"6c073d0e-ac3a-4cb8-a611-8e5e0f99c3e9","Type":"ContainerStarted","Data":"c2f3f9cd4c651f65374a76c96301edb6b401f03be83e731340ff145a95842621"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.476719 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" event={"ID":"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc","Type":"ContainerStarted","Data":"8f5e6cc77bc67d9dc6fc2bf4076856728558e9b2571175babaa203aeb17401d8"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.476794 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" event={"ID":"8f3fbc5c-1a5c-408c-acc0-648113e6e9bc","Type":"ContainerStarted","Data":"046d7ea4315e67f1ce3f0e4a371dda5ff2d24de1a9f7dcedad3c550499b61a9b"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.501043 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" event={"ID":"beb408f9-638c-4bd9-b4ec-c72bc43286e3","Type":"ContainerStarted","Data":"6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.502171 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.504386 4965 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8pdbx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.504424 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" podUID="beb408f9-638c-4bd9-b4ec-c72bc43286e3" containerName="marketplace-operator" 
probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.506415 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4sskq" podStartSLOduration=52.506404355 podStartE2EDuration="52.506404355s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.477457657 +0000 UTC m=+114.463645136" watchObservedRunningTime="2026-03-18 11:58:29.506404355 +0000 UTC m=+114.492591834" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.520562 4965 generic.go:334] "Generic (PLEG): container finished" podID="6cfd1d38-a35c-47ab-962f-4403875e5a19" containerID="579d29b1e2736df088bc582323a9a7d614eec2fba479e3986543ac20727aafb3" exitCode=0 Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.521548 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" event={"ID":"6cfd1d38-a35c-47ab-962f-4403875e5a19","Type":"ContainerDied","Data":"579d29b1e2736df088bc582323a9a7d614eec2fba479e3986543ac20727aafb3"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.538832 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" event={"ID":"fb8ab568-8d37-4899-9337-600b4f41dfe5","Type":"ContainerStarted","Data":"4b025bda1a41a7c3f4ed596cc39b8def6d197d37a8cdf4b341f575692f6c7e1b"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.542140 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kwbct" podStartSLOduration=52.542119679 podStartE2EDuration="52.542119679s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.512366309 +0000 UTC m=+114.498553838" watchObservedRunningTime="2026-03-18 11:58:29.542119679 +0000 UTC m=+114.528307158" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.543884 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8xpr" podStartSLOduration=53.543875437 podStartE2EDuration="53.543875437s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.541369168 +0000 UTC m=+114.527556647" watchObservedRunningTime="2026-03-18 11:58:29.543875437 +0000 UTC m=+114.530062926" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.548170 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" event={"ID":"8b27563b-d715-4dfa-84b6-5d0f1d90e4b5","Type":"ContainerStarted","Data":"5b60dbe59d9b92cf3d0e76a23a55858474874f39ecf6e07ce392394a23a67424"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.548896 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.558155 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.559618 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.059596021 +0000 UTC m=+115.045783500 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.564200 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" event={"ID":"98ebf2d2-03be-474a-a318-0e91e9164758","Type":"ContainerStarted","Data":"b74b872c72ab0811f62f1a57f8651912a87228b83538b0d61e4813684c4dc21c"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.564606 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.568774 4965 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-nb2r4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.568828 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" podUID="98ebf2d2-03be-474a-a318-0e91e9164758" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.594765 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" event={"ID":"10476ddd-0e25-4297-8e49-ae3cb9d2d0c4","Type":"ContainerStarted","Data":"357f7c869c50d6f36c56b7e039992d67e8c5ed9d3fccf6b801631403056de423"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.598530 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z" event={"ID":"c3e59a9f-c0ea-4a0e-a39a-f332e3eba1a4","Type":"ContainerStarted","Data":"6c901f937b374b5207148bae8736dae406bf647b191f1de4013e658905d02227"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.602681 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-q4h66" podStartSLOduration=53.602644357 podStartE2EDuration="53.602644357s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.602375809 +0000 UTC m=+114.588563298" watchObservedRunningTime="2026-03-18 11:58:29.602644357 +0000 UTC m=+114.588831836" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.605248 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" podStartSLOduration=52.605241428 podStartE2EDuration="52.605241428s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.57917311 +0000 UTC m=+114.565360589" watchObservedRunningTime="2026-03-18 11:58:29.605241428 +0000 UTC m=+114.591428907" Mar 18 
11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.646733 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" event={"ID":"89a368d8-7c06-47c3-9a5f-0c5cc5a5cd47","Type":"ContainerStarted","Data":"098647b3674f7090fa83fba9770c5615bc92802a8e5d9a2073f7808f78ebbf9d"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.663432 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.678065 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.178050534 +0000 UTC m=+115.164238013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.699405 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" event={"ID":"19d4f554-489c-46b5-aba4-919ab05d8853","Type":"ContainerStarted","Data":"3e2d4f0ef3fb21d061c5909c3a337a13af2cb72bed111b371a237185e3960693"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.699443 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" event={"ID":"19d4f554-489c-46b5-aba4-919ab05d8853","Type":"ContainerStarted","Data":"757b6d68bd7ffed898e3d0e9588b8df66ff5b06f30a711db41fb79b5068f8107"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.699458 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.731787 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:29 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:29 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:29 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.731835 4965 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.746947 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" event={"ID":"fefb79e0-1f20-4a4f-bdf2-f3bad9d2a7c3","Type":"ContainerStarted","Data":"7d9a9f7d8a9c00a784a500758cac65bcf046ae8e9b8d419b47dd7ec0fe732c74"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.767360 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.768581 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.268565708 +0000 UTC m=+115.254753187 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.788291 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" event={"ID":"7a9e071f-3977-4a85-9b66-30c59efd7d3a","Type":"ContainerStarted","Data":"94b6de08f7d83eae81c7ae31b492aee3c6d3038ead2d7fa0c3323fbdde84f4af"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.806882 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4xx64" event={"ID":"ab0e8f3c-d83a-4b3a-aee7-f530f82761f8","Type":"ContainerStarted","Data":"25ff2a5e0a9eaaea1bfd607ad285f5900249fcd5c1c53c972a4b8d3d52bb15ce"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.841442 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" event={"ID":"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7","Type":"ContainerStarted","Data":"f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.842431 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.850295 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" podStartSLOduration=53.85027904 podStartE2EDuration="53.85027904s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.848170952 +0000 UTC m=+114.834358431" watchObservedRunningTime="2026-03-18 11:58:29.85027904 +0000 UTC m=+114.836466519" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.864899 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jvz5t" event={"ID":"7f9a78ea-4484-46dd-ab80-ac76fc089420","Type":"ContainerStarted","Data":"4b45e8ed1bf12e7261cee55798e654e846a1bf0ecbe5b51c5264dce85640de9d"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.873156 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.875373 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.375361601 +0000 UTC m=+115.361549080 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.908919 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-q9zxt" event={"ID":"18b96e51-1a8b-4947-a9a3-9af0c0e88100","Type":"ContainerStarted","Data":"6fe13c90c937cbec7b906ef2f64870ecf8f746dd9da60caf907fc5d81c0afdb7"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.909689 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-q9zxt" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.925983 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-q9zxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.926055 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-q9zxt" podUID="18b96e51-1a8b-4947-a9a3-9af0c0e88100" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.935569 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f" 
event={"ID":"e712d1d3-e146-41c2-a5ef-610e4bb8771c","Type":"ContainerStarted","Data":"6a8f885f38401dde100b1ba01e0a89a113c187c10aed26a27c6a920ab616a8db"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.935623 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f" event={"ID":"e712d1d3-e146-41c2-a5ef-610e4bb8771c","Type":"ContainerStarted","Data":"21f332d5b9c560a21f7a010a197413287c15d5f658f6ca7c6818bf893ef351ec"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.974097 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:29 crc kubenswrapper[4965]: E0318 11:58:29.976267 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.476247141 +0000 UTC m=+115.462434620 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.991670 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" event={"ID":"ae68bf3c-11de-49ed-8fc2-a669cb8a04fa","Type":"ContainerStarted","Data":"a668ea9f96ed4e2fe652085ed402b8b5e2d76917d472ba11da6c928de1d4a63c"} Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.992255 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk" podStartSLOduration=52.992230661 podStartE2EDuration="52.992230661s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:29.971195302 +0000 UTC m=+114.957382791" watchObservedRunningTime="2026-03-18 11:58:29.992230661 +0000 UTC m=+114.978418150" Mar 18 11:58:29 crc kubenswrapper[4965]: I0318 11:58:29.992423 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5scj2"] Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.009433 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" event={"ID":"c66fcc55-a5d5-4886-afa8-33dd5be2f631","Type":"ContainerStarted","Data":"7cd139d6c9c6d1022f01e93c51ce2f35abb91971d5aadc7f0f94813f9197b339"} Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.010275 4965 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.012358 4965 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-cv5z6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.012397 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" podUID="c66fcc55-a5d5-4886-afa8-33dd5be2f631" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.023770 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" podStartSLOduration=53.0237454 podStartE2EDuration="53.0237454s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.010029202 +0000 UTC m=+114.996216691" watchObservedRunningTime="2026-03-18 11:58:30.0237454 +0000 UTC m=+115.009932879" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.038321 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-fgg7z" podStartSLOduration=53.038300001 podStartE2EDuration="53.038300001s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.033973131 +0000 UTC m=+115.020160630" 
watchObservedRunningTime="2026-03-18 11:58:30.038300001 +0000 UTC m=+115.024487480" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.043450 4965 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sw5p7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.043500 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" podUID="29015bb8-e604-425d-a88b-db3ec9e10096" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.048054 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.048086 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9jx9z" event={"ID":"1b676c03-201d-403c-8082-84451760c106","Type":"ContainerStarted","Data":"b657164956d48bc3507f3ef49d0642fea6994fc70f5a399319465f77e558e99a"} Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.048109 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9jx9z" event={"ID":"1b676c03-201d-403c-8082-84451760c106","Type":"ContainerStarted","Data":"cb80960eb2fa25ee496f6890777db7811aea7058ed7a35820bfb0e314499ff11"} Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.075465 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.077719 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.577707146 +0000 UTC m=+115.563894625 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.122562 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wg6wd" podStartSLOduration=54.122545762 podStartE2EDuration="54.122545762s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.121329998 +0000 UTC m=+115.107517467" watchObservedRunningTime="2026-03-18 11:58:30.122545762 +0000 UTC m=+115.108733241" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.124182 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4nfn" podStartSLOduration=53.124175897 podStartE2EDuration="53.124175897s" podCreationTimestamp="2026-03-18 11:57:37 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.081937253 +0000 UTC m=+115.068124732" watchObservedRunningTime="2026-03-18 11:58:30.124175897 +0000 UTC m=+115.110363376" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.157363 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2dm7f" podStartSLOduration=53.157343951 podStartE2EDuration="53.157343951s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.155936412 +0000 UTC m=+115.142123891" watchObservedRunningTime="2026-03-18 11:58:30.157343951 +0000 UTC m=+115.143531440" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.176553 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.178697 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.678672458 +0000 UTC m=+115.664859937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.196271 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" podStartSLOduration=53.196254623 podStartE2EDuration="53.196254623s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.195249245 +0000 UTC m=+115.181436724" watchObservedRunningTime="2026-03-18 11:58:30.196254623 +0000 UTC m=+115.182442112" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.233363 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rwvcf" podStartSLOduration=54.233343745 podStartE2EDuration="54.233343745s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.233044106 +0000 UTC m=+115.219231585" watchObservedRunningTime="2026-03-18 11:58:30.233343745 +0000 UTC m=+115.219531224" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.272866 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vfp9m" podStartSLOduration=53.272847523 podStartE2EDuration="53.272847523s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.270813527 +0000 UTC m=+115.257001016" watchObservedRunningTime="2026-03-18 11:58:30.272847523 +0000 UTC m=+115.259035002" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.278757 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.279132 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.779117106 +0000 UTC m=+115.765304585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.301523 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" podStartSLOduration=53.301502353 podStartE2EDuration="53.301502353s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.300360421 +0000 UTC m=+115.286547900" watchObservedRunningTime="2026-03-18 11:58:30.301502353 +0000 UTC m=+115.287689832" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.348401 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42456: no serving certificate available for the kubelet" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.361449 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" podStartSLOduration=53.361431694 podStartE2EDuration="53.361431694s" podCreationTimestamp="2026-03-18 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.358483903 +0000 UTC m=+115.344671392" watchObservedRunningTime="2026-03-18 11:58:30.361431694 +0000 UTC m=+115.347619173" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.380208 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.380707 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.880691565 +0000 UTC m=+115.866879044 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.396520 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-q9zxt" podStartSLOduration=54.39650518 podStartE2EDuration="54.39650518s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.395460382 +0000 UTC m=+115.381647861" watchObservedRunningTime="2026-03-18 11:58:30.39650518 +0000 UTC m=+115.382692659" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.423195 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4xx64" podStartSLOduration=8.423176394 podStartE2EDuration="8.423176394s" podCreationTimestamp="2026-03-18 11:58:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:30.422018612 +0000 UTC m=+115.408206091" watchObservedRunningTime="2026-03-18 11:58:30.423176394 +0000 UTC m=+115.409363863" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.435123 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42458: no serving certificate available for the kubelet" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.471935 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.471981 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.480804 4965 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-qdbck container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.480871 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" podUID="7a9e071f-3977-4a85-9b66-30c59efd7d3a" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.481905 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.482259 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:30.982246292 +0000 UTC m=+115.968433771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.529876 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42474: no serving certificate available for the kubelet" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.560899 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42486: no serving certificate available for the kubelet" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.584203 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.584565 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 11:58:31.08451975 +0000 UTC m=+116.070707239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.584648 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.585051 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.085039014 +0000 UTC m=+116.071226493 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.666151 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42500: no serving certificate available for the kubelet" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.685901 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.686323 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.186292124 +0000 UTC m=+116.172479743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.686480 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.686798 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.186785208 +0000 UTC m=+116.172972687 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.729421 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:30 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:30 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:30 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.729485 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.756798 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42506: no serving certificate available for the kubelet" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.787471 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.787686 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.287649747 +0000 UTC m=+116.273837226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.787903 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.788273 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.288260474 +0000 UTC m=+116.274447953 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.849993 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42520: no serving certificate available for the kubelet" Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.888862 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.889049 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.38902378 +0000 UTC m=+116.375211259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.889079 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.889381 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.38937382 +0000 UTC m=+116.375561299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.990235 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.990406 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.490377123 +0000 UTC m=+116.476564602 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:30 crc kubenswrapper[4965]: I0318 11:58:30.990500 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:30 crc kubenswrapper[4965]: E0318 11:58:30.990837 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.490825145 +0000 UTC m=+116.477012614 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.023397 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42524: no serving certificate available for the kubelet" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.049280 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" event={"ID":"047424fe-da2d-4da3-a8b4-ff9bc5ae743f","Type":"ContainerStarted","Data":"53c34869decdf66cfcbf3e67cd26b67d66cab076a32a9bc57f1e09ffeb9a8d7e"} Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.051440 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w9kwc" event={"ID":"ea7fccc2-5160-43d1-bbfa-7c6905d4a0d8","Type":"ContainerStarted","Data":"8e112fe675006ee37074a9e0a4d1d4cc2b7a54f76b97c332a5ab491ee31ac712"} Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.051544 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.056320 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" event={"ID":"6cfd1d38-a35c-47ab-962f-4403875e5a19","Type":"ContainerStarted","Data":"cebe8591793c8f0c7501ed7615910aa045bf4d6ebbee2ddfd3d006386388f072"} Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.056359 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" 
event={"ID":"6cfd1d38-a35c-47ab-962f-4403875e5a19","Type":"ContainerStarted","Data":"89e75963015ba37e8557b526b41eb29a5a2d95f9c650b55abc79f019a2f523dc"} Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.058406 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9jx9z" event={"ID":"1b676c03-201d-403c-8082-84451760c106","Type":"ContainerStarted","Data":"47f24216313a60aee8cea7832d1d6239bb16333c10f68861368418f8855f9f2c"} Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.059195 4965 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8pdbx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.059266 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" podUID="beb408f9-638c-4bd9-b4ec-c72bc43286e3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.059779 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" podUID="2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" containerName="route-controller-manager" containerID="cri-o://f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca" gracePeriod=30 Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.060121 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-q9zxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 18 11:58:31 crc 
kubenswrapper[4965]: I0318 11:58:31.060162 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-q9zxt" podUID="18b96e51-1a8b-4947-a9a3-9af0c0e88100" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.061589 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" podUID="ccc8d88d-863a-43b2-916d-b9df5a38453d" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" gracePeriod=30 Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.062587 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" podUID="29015bb8-e604-425d-a88b-db3ec9e10096" containerName="controller-manager" containerID="cri-o://9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b" gracePeriod=30 Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.065102 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.073056 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dddvm" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.077249 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cv5z6" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.080365 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nb2r4" Mar 18 11:58:31 crc 
kubenswrapper[4965]: I0318 11:58:31.087994 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.091759 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.091920 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.59190038 +0000 UTC m=+116.578087859 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.092102 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.092422 4965 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.592414824 +0000 UTC m=+116.578602303 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.107246 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w9kwc" podStartSLOduration=9.107229702 podStartE2EDuration="9.107229702s" podCreationTimestamp="2026-03-18 11:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:31.102521303 +0000 UTC m=+116.088708782" watchObservedRunningTime="2026-03-18 11:58:31.107229702 +0000 UTC m=+116.093417181" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.198337 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.200251 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 11:58:31.700231415 +0000 UTC m=+116.686418894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.303137 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.303851 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.80383567 +0000 UTC m=+116.790023149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.404278 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.406357 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:31.906329923 +0000 UTC m=+116.892517412 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.422598 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" podStartSLOduration=55.422558771 podStartE2EDuration="55.422558771s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:31.406113818 +0000 UTC m=+116.392301297" watchObservedRunningTime="2026-03-18 11:58:31.422558771 +0000 UTC m=+116.408746250" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.423253 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9jx9z" podStartSLOduration=55.42324532 podStartE2EDuration="55.42324532s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:31.291451228 +0000 UTC m=+116.277638707" watchObservedRunningTime="2026-03-18 11:58:31.42324532 +0000 UTC m=+116.409432799" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.423882 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fs56m" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.517129 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.517565 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:32.017550118 +0000 UTC m=+117.003737607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.622421 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.622557 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:32.122534941 +0000 UTC m=+117.108722430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.622785 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.623224 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:32.123212479 +0000 UTC m=+117.109399958 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.711034 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.716026 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42536: no serving certificate available for the kubelet" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.724201 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.724583 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:32.224562702 +0000 UTC m=+117.210750181 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.734029 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:31 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:31 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:31 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.734072 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.755143 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-768f74964f-q9448"] Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.755379 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29015bb8-e604-425d-a88b-db3ec9e10096" containerName="controller-manager" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.755395 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="29015bb8-e604-425d-a88b-db3ec9e10096" containerName="controller-manager" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.755506 4965 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="29015bb8-e604-425d-a88b-db3ec9e10096" containerName="controller-manager" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.755930 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.769236 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-768f74964f-q9448"] Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.824859 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x98tf\" (UniqueName: \"kubernetes.io/projected/29015bb8-e604-425d-a88b-db3ec9e10096-kube-api-access-x98tf\") pod \"29015bb8-e604-425d-a88b-db3ec9e10096\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.824922 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-proxy-ca-bundles\") pod \"29015bb8-e604-425d-a88b-db3ec9e10096\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.824957 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-client-ca\") pod \"29015bb8-e604-425d-a88b-db3ec9e10096\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.824995 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29015bb8-e604-425d-a88b-db3ec9e10096-serving-cert\") pod \"29015bb8-e604-425d-a88b-db3ec9e10096\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.825026 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-config\") pod \"29015bb8-e604-425d-a88b-db3ec9e10096\" (UID: \"29015bb8-e604-425d-a88b-db3ec9e10096\") " Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.825310 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.825641 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:32.325628097 +0000 UTC m=+117.311815576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.826347 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "29015bb8-e604-425d-a88b-db3ec9e10096" (UID: "29015bb8-e604-425d-a88b-db3ec9e10096"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.826790 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-client-ca" (OuterVolumeSpecName: "client-ca") pod "29015bb8-e604-425d-a88b-db3ec9e10096" (UID: "29015bb8-e604-425d-a88b-db3ec9e10096"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.828371 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-config" (OuterVolumeSpecName: "config") pod "29015bb8-e604-425d-a88b-db3ec9e10096" (UID: "29015bb8-e604-425d-a88b-db3ec9e10096"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.838544 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29015bb8-e604-425d-a88b-db3ec9e10096-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "29015bb8-e604-425d-a88b-db3ec9e10096" (UID: "29015bb8-e604-425d-a88b-db3ec9e10096"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.839982 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29015bb8-e604-425d-a88b-db3ec9e10096-kube-api-access-x98tf" (OuterVolumeSpecName: "kube-api-access-x98tf") pod "29015bb8-e604-425d-a88b-db3ec9e10096" (UID: "29015bb8-e604-425d-a88b-db3ec9e10096"). InnerVolumeSpecName "kube-api-access-x98tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.897410 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.929623 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.929875 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-proxy-ca-bundles\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.929934 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-config\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.929966 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vww9\" (UniqueName: \"kubernetes.io/projected/41c127cf-02c5-45cd-bf65-5015dd07bd7f-kube-api-access-5vww9\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.929991 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c127cf-02c5-45cd-bf65-5015dd07bd7f-serving-cert\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.930034 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-client-ca\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.930082 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x98tf\" (UniqueName: \"kubernetes.io/projected/29015bb8-e604-425d-a88b-db3ec9e10096-kube-api-access-x98tf\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.930095 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.930103 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.930113 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29015bb8-e604-425d-a88b-db3ec9e10096-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.930121 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29015bb8-e604-425d-a88b-db3ec9e10096-config\") on 
node \"crc\" DevicePath \"\"" Mar 18 11:58:31 crc kubenswrapper[4965]: E0318 11:58:31.930401 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:32.430385903 +0000 UTC m=+117.416573382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:31 crc kubenswrapper[4965]: I0318 11:58:31.978849 4965 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033039 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-config\") pod \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033263 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-serving-cert\") pod \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033303 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-client-ca\") pod \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033334 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnzl5\" (UniqueName: \"kubernetes.io/projected/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-kube-api-access-cnzl5\") pod \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\" (UID: \"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7\") " Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033535 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-client-ca\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033594 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-proxy-ca-bundles\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033627 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033700 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-config\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033739 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vww9\" (UniqueName: \"kubernetes.io/projected/41c127cf-02c5-45cd-bf65-5015dd07bd7f-kube-api-access-5vww9\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.033769 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c127cf-02c5-45cd-bf65-5015dd07bd7f-serving-cert\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.034213 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-config" (OuterVolumeSpecName: "config") pod "2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" (UID: "2c56d894-8d33-4682-b8ea-4d99fc3bb1b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:32 crc kubenswrapper[4965]: E0318 11:58:32.034481 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 11:58:32.534470961 +0000 UTC m=+117.520658440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmt4c" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.035160 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-client-ca\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.035580 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-config\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.035859 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-proxy-ca-bundles\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.050716 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-kube-api-access-cnzl5" (OuterVolumeSpecName: "kube-api-access-cnzl5") pod "2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" (UID: 
"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7"). InnerVolumeSpecName "kube-api-access-cnzl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.053018 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" (UID: "2c56d894-8d33-4682-b8ea-4d99fc3bb1b7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.053943 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" (UID: "2c56d894-8d33-4682-b8ea-4d99fc3bb1b7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.055056 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c127cf-02c5-45cd-bf65-5015dd07bd7f-serving-cert\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.090111 4965 generic.go:334] "Generic (PLEG): container finished" podID="29015bb8-e604-425d-a88b-db3ec9e10096" containerID="9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b" exitCode=0 Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.090195 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" event={"ID":"29015bb8-e604-425d-a88b-db3ec9e10096","Type":"ContainerDied","Data":"9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b"} 
Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.090227 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" event={"ID":"29015bb8-e604-425d-a88b-db3ec9e10096","Type":"ContainerDied","Data":"2fe8dcd9b2d990422de9e6ea4ee1c7cede7f481e3c0b96f2cf74bf78be0b088a"} Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.090246 4965 scope.go:117] "RemoveContainer" containerID="9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.090399 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sw5p7" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.095499 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vww9\" (UniqueName: \"kubernetes.io/projected/41c127cf-02c5-45cd-bf65-5015dd07bd7f-kube-api-access-5vww9\") pod \"controller-manager-768f74964f-q9448\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.122501 4965 generic.go:334] "Generic (PLEG): container finished" podID="2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" containerID="f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca" exitCode=0 Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.122606 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" event={"ID":"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7","Type":"ContainerDied","Data":"f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca"} Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.122635 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" 
event={"ID":"2c56d894-8d33-4682-b8ea-4d99fc3bb1b7","Type":"ContainerDied","Data":"13d30575086fb5f3f678c045140f124f319d05bc36b9983f41d0722132c2755e"} Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.122724 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.131316 4965 scope.go:117] "RemoveContainer" containerID="9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.144430 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.144740 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.144781 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.144793 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnzl5\" (UniqueName: \"kubernetes.io/projected/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-kube-api-access-cnzl5\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.144804 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7-config\") on node \"crc\" DevicePath 
\"\"" Mar 18 11:58:32 crc kubenswrapper[4965]: E0318 11:58:32.144875 4965 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 11:58:32.644857282 +0000 UTC m=+117.631044761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 11:58:32 crc kubenswrapper[4965]: E0318 11:58:32.145911 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b\": container with ID starting with 9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b not found: ID does not exist" containerID="9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.146031 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b"} err="failed to get container status \"9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b\": rpc error: code = NotFound desc = could not find container \"9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b\": container with ID starting with 9437442281bd48cf1ed1577080310b713dba0901a90f72916bce9c169219638b not found: ID does not exist" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.146105 4965 scope.go:117] 
"RemoveContainer" containerID="f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.156102 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" event={"ID":"047424fe-da2d-4da3-a8b4-ff9bc5ae743f","Type":"ContainerStarted","Data":"0b0b5e0628cc3b6adc2c2b67a27e99c6ab2e7226e876f0374b3166992f68aa4c"} Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.156372 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" event={"ID":"047424fe-da2d-4da3-a8b4-ff9bc5ae743f","Type":"ContainerStarted","Data":"c3d042976ee663303def42a21482dbec9fcf6ae3634483572f572197d99db4dd"} Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.157319 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-q9zxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.157428 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-q9zxt" podUID="18b96e51-1a8b-4947-a9a3-9af0c0e88100" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.158181 4965 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8pdbx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.158283 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" 
podUID="beb408f9-638c-4bd9-b4ec-c72bc43286e3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.158368 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sw5p7"] Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.166065 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sw5p7"] Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.190366 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.191226 4965 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T11:58:31.979057914Z","Handler":null,"Name":""} Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.199810 4965 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.199843 4965 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.202604 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"] Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.210419 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5x9mq"] Mar 18 
11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.233396 4965 scope.go:117] "RemoveContainer" containerID="f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca" Mar 18 11:58:32 crc kubenswrapper[4965]: E0318 11:58:32.241891 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca\": container with ID starting with f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca not found: ID does not exist" containerID="f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.241944 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca"} err="failed to get container status \"f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca\": rpc error: code = NotFound desc = could not find container \"f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca\": container with ID starting with f25e47fc582582ccdc26cb83b5e52fed75080a556d66c2e4f787bbc8084c7cca not found: ID does not exist" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.266377 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.292392 4965 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.292442 4965 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.354040 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmt4c\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.368226 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.377271 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.425981 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5wjlx"] Mar 18 11:58:32 crc kubenswrapper[4965]: E0318 11:58:32.426171 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" containerName="route-controller-manager" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.426182 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" containerName="route-controller-manager" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.426267 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" containerName="route-controller-manager" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.426930 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.429111 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.442449 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wjlx"] Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.469320 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-catalog-content\") pod \"community-operators-5wjlx\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") " pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.469392 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-utilities\") pod \"community-operators-5wjlx\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") " pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.469462 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-895zv\" (UniqueName: \"kubernetes.io/projected/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-kube-api-access-895zv\") pod \"community-operators-5wjlx\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") " pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.570324 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-catalog-content\") pod \"community-operators-5wjlx\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") " pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.570403 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-utilities\") pod \"community-operators-5wjlx\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") " pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.570471 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-895zv\" (UniqueName: \"kubernetes.io/projected/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-kube-api-access-895zv\") pod \"community-operators-5wjlx\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") " pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.572006 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-utilities\") pod \"community-operators-5wjlx\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") " pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.572278 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-catalog-content\") pod \"community-operators-5wjlx\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") " pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.588382 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-768f74964f-q9448"] Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.592543 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-895zv\" (UniqueName: \"kubernetes.io/projected/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-kube-api-access-895zv\") pod \"community-operators-5wjlx\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") " pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.607092 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.637836 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sjkft"] Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.639016 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.642720 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.644394 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjkft"] Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.673845 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-utilities\") pod \"certified-operators-sjkft\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") " pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.673919 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-catalog-content\") pod \"certified-operators-sjkft\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") " pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.673957 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwnvg\" (UniqueName: \"kubernetes.io/projected/c3614d69-f3b4-4496-af2f-d119c56de1c7-kube-api-access-zwnvg\") pod \"certified-operators-sjkft\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") " pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.725907 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Mar 18 11:58:32 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:32 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:32 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.725963 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.744481 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.774647 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-utilities\") pod \"certified-operators-sjkft\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") " pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.774757 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-catalog-content\") pod \"certified-operators-sjkft\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") " pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.774817 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwnvg\" (UniqueName: \"kubernetes.io/projected/c3614d69-f3b4-4496-af2f-d119c56de1c7-kube-api-access-zwnvg\") pod \"certified-operators-sjkft\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") " pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.776416 4965 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-utilities\") pod \"certified-operators-sjkft\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") " pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.776691 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-catalog-content\") pod \"certified-operators-sjkft\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") " pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.796209 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwnvg\" (UniqueName: \"kubernetes.io/projected/c3614d69-f3b4-4496-af2f-d119c56de1c7-kube-api-access-zwnvg\") pod \"certified-operators-sjkft\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") " pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.826630 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9btzt"] Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.827486 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.842651 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9btzt"] Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.878374 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-utilities\") pod \"community-operators-9btzt\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.878424 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-catalog-content\") pod \"community-operators-9btzt\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.878459 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6wdl\" (UniqueName: \"kubernetes.io/projected/11860b27-ab66-423c-9939-df595c023a38-kube-api-access-h6wdl\") pod \"community-operators-9btzt\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.896178 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmt4c"] Mar 18 11:58:32 crc kubenswrapper[4965]: W0318 11:58:32.906141 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad0ada2e_efc3_4a75_b954_15e8b893e2bf.slice/crio-27a6752d52e21dcbefacc51add472f8feec6d5863fc68e1e8c5aa9dfbccc2919 
WatchSource:0}: Error finding container 27a6752d52e21dcbefacc51add472f8feec6d5863fc68e1e8c5aa9dfbccc2919: Status 404 returned error can't find the container with id 27a6752d52e21dcbefacc51add472f8feec6d5863fc68e1e8c5aa9dfbccc2919 Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.980049 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-utilities\") pod \"community-operators-9btzt\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.980369 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-catalog-content\") pod \"community-operators-9btzt\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.980406 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6wdl\" (UniqueName: \"kubernetes.io/projected/11860b27-ab66-423c-9939-df595c023a38-kube-api-access-h6wdl\") pod \"community-operators-9btzt\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.981843 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-utilities\") pod \"community-operators-9btzt\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.981910 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-catalog-content\") pod \"community-operators-9btzt\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.982171 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:58:32 crc kubenswrapper[4965]: I0318 11:58:32.999813 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6wdl\" (UniqueName: \"kubernetes.io/projected/11860b27-ab66-423c-9939-df595c023a38-kube-api-access-h6wdl\") pod \"community-operators-9btzt\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.003594 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wjlx"] Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.031489 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6pk8b"] Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.032947 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.035996 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42542: no serving certificate available for the kubelet" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.043343 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.044629 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.102379 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.104084 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6pk8b"] Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.111455 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.129701 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.148102 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.154856 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" event={"ID":"ad0ada2e-efc3-4a75-b954-15e8b893e2bf","Type":"ContainerStarted","Data":"27a6752d52e21dcbefacc51add472f8feec6d5863fc68e1e8c5aa9dfbccc2919"} Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.170216 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" event={"ID":"047424fe-da2d-4da3-a8b4-ff9bc5ae743f","Type":"ContainerStarted","Data":"615da62844084da4a059797374c8be2993561485fd0cd01c7859e3582d57fad2"} Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.197849 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" event={"ID":"41c127cf-02c5-45cd-bf65-5015dd07bd7f","Type":"ContainerStarted","Data":"234b60652eb06cf5c71cd97027084cab71bc15881eb956b6ffa36efe073190b3"} Mar 18 11:58:33 crc 
kubenswrapper[4965]: I0318 11:58:33.197895 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" event={"ID":"41c127cf-02c5-45cd-bf65-5015dd07bd7f","Type":"ContainerStarted","Data":"2be71dd524371e104a3651febd59472497b678a257d576c45bdfb459f2752252"} Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.198509 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.199919 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wjlx" event={"ID":"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5","Type":"ContainerStarted","Data":"8a0202d4beb33ca70efb1d561f22b4fdb3c95166da0d747508231f1ae42518fa"} Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.202093 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-utilities\") pod \"certified-operators-6pk8b\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.202198 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-catalog-content\") pod \"certified-operators-6pk8b\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.202223 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfc57\" (UniqueName: \"kubernetes.io/projected/4329347a-487b-4012-a715-9565bc4e67d0-kube-api-access-tfc57\") pod \"certified-operators-6pk8b\" (UID: 
\"4329347a-487b-4012-a715-9565bc4e67d0\") " pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.202274 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04982060-6c74-4de3-a270-3b3b59bb00e7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"04982060-6c74-4de3-a270-3b3b59bb00e7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.202293 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04982060-6c74-4de3-a270-3b3b59bb00e7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"04982060-6c74-4de3-a270-3b3b59bb00e7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.233568 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.260759 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xjqrf" podStartSLOduration=10.260722359 podStartE2EDuration="10.260722359s" podCreationTimestamp="2026-03-18 11:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:33.208911221 +0000 UTC m=+118.195098700" watchObservedRunningTime="2026-03-18 11:58:33.260722359 +0000 UTC m=+118.246909838" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.261982 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" podStartSLOduration=4.261976633 podStartE2EDuration="4.261976633s" 
podCreationTimestamp="2026-03-18 11:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:33.255023172 +0000 UTC m=+118.241210671" watchObservedRunningTime="2026-03-18 11:58:33.261976633 +0000 UTC m=+118.248164102" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.309753 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-catalog-content\") pod \"certified-operators-6pk8b\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.309787 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfc57\" (UniqueName: \"kubernetes.io/projected/4329347a-487b-4012-a715-9565bc4e67d0-kube-api-access-tfc57\") pod \"certified-operators-6pk8b\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.309843 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04982060-6c74-4de3-a270-3b3b59bb00e7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"04982060-6c74-4de3-a270-3b3b59bb00e7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.309859 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04982060-6c74-4de3-a270-3b3b59bb00e7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"04982060-6c74-4de3-a270-3b3b59bb00e7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.309919 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-utilities\") pod \"certified-operators-6pk8b\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.310799 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04982060-6c74-4de3-a270-3b3b59bb00e7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"04982060-6c74-4de3-a270-3b3b59bb00e7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.311726 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-utilities\") pod \"certified-operators-6pk8b\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.318880 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-catalog-content\") pod \"certified-operators-6pk8b\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.350146 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfc57\" (UniqueName: \"kubernetes.io/projected/4329347a-487b-4012-a715-9565bc4e67d0-kube-api-access-tfc57\") pod \"certified-operators-6pk8b\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.379606 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/04982060-6c74-4de3-a270-3b3b59bb00e7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"04982060-6c74-4de3-a270-3b3b59bb00e7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.442538 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjkft"] Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.448067 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.490228 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.572864 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9btzt"] Mar 18 11:58:33 crc kubenswrapper[4965]: W0318 11:58:33.613842 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11860b27_ab66_423c_9939_df595c023a38.slice/crio-1f708c444020d7f9fbc4a4abea9811bd3a12dfd982ad027969b7f6f7782e477c WatchSource:0}: Error finding container 1f708c444020d7f9fbc4a4abea9811bd3a12dfd982ad027969b7f6f7782e477c: Status 404 returned error can't find the container with id 1f708c444020d7f9fbc4a4abea9811bd3a12dfd982ad027969b7f6f7782e477c Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.741019 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:33 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:33 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:33 crc kubenswrapper[4965]: 
healthz check failed Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.741092 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:33 crc kubenswrapper[4965]: I0318 11:58:33.902304 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6pk8b"] Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.027534 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29015bb8-e604-425d-a88b-db3ec9e10096" path="/var/lib/kubelet/pods/29015bb8-e604-425d-a88b-db3ec9e10096/volumes" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.028430 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c56d894-8d33-4682-b8ea-4d99fc3bb1b7" path="/var/lib/kubelet/pods/2c56d894-8d33-4682-b8ea-4d99fc3bb1b7/volumes" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.029185 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.153857 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 11:58:34 crc kubenswrapper[4965]: W0318 11:58:34.174561 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod04982060_6c74_4de3_a270_3b3b59bb00e7.slice/crio-f6395e84cfbab9983b7b496eb53d7e17ab96b58a7d1c6f2823c1f91cc138b3de WatchSource:0}: Error finding container f6395e84cfbab9983b7b496eb53d7e17ab96b58a7d1c6f2823c1f91cc138b3de: Status 404 returned error can't find the container with id f6395e84cfbab9983b7b496eb53d7e17ab96b58a7d1c6f2823c1f91cc138b3de Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.234879 
4965 generic.go:334] "Generic (PLEG): container finished" podID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerID="867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64" exitCode=0 Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.235265 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wjlx" event={"ID":"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5","Type":"ContainerDied","Data":"867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64"} Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.240066 4965 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.246216 4965 generic.go:334] "Generic (PLEG): container finished" podID="4329347a-487b-4012-a715-9565bc4e67d0" containerID="2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b" exitCode=0 Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.246325 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pk8b" event={"ID":"4329347a-487b-4012-a715-9565bc4e67d0","Type":"ContainerDied","Data":"2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b"} Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.246356 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pk8b" event={"ID":"4329347a-487b-4012-a715-9565bc4e67d0","Type":"ContainerStarted","Data":"2f144bfbd2b49c1599ec7dd0ad239255137becc433f7c5447678e433e9fcbab6"} Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.267207 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq"] Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.268439 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.285982 4965 generic.go:334] "Generic (PLEG): container finished" podID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerID="e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025" exitCode=0 Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.286097 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjkft" event={"ID":"c3614d69-f3b4-4496-af2f-d119c56de1c7","Type":"ContainerDied","Data":"e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025"} Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.286138 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq"] Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.286160 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjkft" event={"ID":"c3614d69-f3b4-4496-af2f-d119c56de1c7","Type":"ContainerStarted","Data":"e75d438cda8551ccf31309f21ca2803ea5494c32f2b41b0f1792d9ff2805379b"} Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.289244 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.289470 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.289610 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.290239 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 11:58:34 crc kubenswrapper[4965]: 
I0318 11:58:34.290665 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.296146 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.349886 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" event={"ID":"ad0ada2e-efc3-4a75-b954-15e8b893e2bf","Type":"ContainerStarted","Data":"dea0017c6cd6eee4415187b0061d85592cca8cc67dc8ab41b96699adb9d7917b"} Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.350465 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.415318 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"04982060-6c74-4de3-a270-3b3b59bb00e7","Type":"ContainerStarted","Data":"f6395e84cfbab9983b7b496eb53d7e17ab96b58a7d1c6f2823c1f91cc138b3de"} Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.421876 4965 generic.go:334] "Generic (PLEG): container finished" podID="11860b27-ab66-423c-9939-df595c023a38" containerID="684ca9c8a152ea91a14aa7f18c1bc9c7c7190a48a5a6507329432f8b03064b25" exitCode=0 Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.422944 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9btzt" event={"ID":"11860b27-ab66-423c-9939-df595c023a38","Type":"ContainerDied","Data":"684ca9c8a152ea91a14aa7f18c1bc9c7c7190a48a5a6507329432f8b03064b25"} Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.423002 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9btzt" 
event={"ID":"11860b27-ab66-423c-9939-df595c023a38","Type":"ContainerStarted","Data":"1f708c444020d7f9fbc4a4abea9811bd3a12dfd982ad027969b7f6f7782e477c"} Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.437229 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k66lh\" (UniqueName: \"kubernetes.io/projected/d3962338-1df0-4e65-a1cf-d1860a48a18c-kube-api-access-k66lh\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.437281 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3962338-1df0-4e65-a1cf-d1860a48a18c-serving-cert\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.437377 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-client-ca\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.437396 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-config\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc 
kubenswrapper[4965]: I0318 11:58:34.495652 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" podStartSLOduration=58.495632264 podStartE2EDuration="58.495632264s" podCreationTimestamp="2026-03-18 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:34.494302137 +0000 UTC m=+119.480489626" watchObservedRunningTime="2026-03-18 11:58:34.495632264 +0000 UTC m=+119.481819743" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.539095 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3962338-1df0-4e65-a1cf-d1860a48a18c-serving-cert\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.539969 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k66lh\" (UniqueName: \"kubernetes.io/projected/d3962338-1df0-4e65-a1cf-d1860a48a18c-kube-api-access-k66lh\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.541310 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-client-ca\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.541444 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-config\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.542578 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-config\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.544405 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-client-ca\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.609440 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3962338-1df0-4e65-a1cf-d1860a48a18c-serving-cert\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.610942 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k66lh\" (UniqueName: \"kubernetes.io/projected/d3962338-1df0-4e65-a1cf-d1860a48a18c-kube-api-access-k66lh\") pod \"route-controller-manager-774bccfc87-tv2nq\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " 
pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.626111 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gwpmg"] Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.627437 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.629774 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.649666 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwpmg"] Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.653225 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.736958 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:34 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:34 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:34 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.737748 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.743752 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-catalog-content\") pod \"redhat-marketplace-gwpmg\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") " pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.743824 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pww9d\" (UniqueName: \"kubernetes.io/projected/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-kube-api-access-pww9d\") pod \"redhat-marketplace-gwpmg\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") " pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.744058 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-utilities\") pod \"redhat-marketplace-gwpmg\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") " pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.798453 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.798502 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.806805 4965 patch_prober.go:28] interesting pod/console-f9d7485db-pj8km container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.806858 4965 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-f9d7485db-pj8km" podUID="bf0cad60-78bb-4325-aaaa-ee2636410fcb" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.852203 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pww9d\" (UniqueName: \"kubernetes.io/projected/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-kube-api-access-pww9d\") pod \"redhat-marketplace-gwpmg\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") " pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.852495 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-utilities\") pod \"redhat-marketplace-gwpmg\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") " pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.852568 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-catalog-content\") pod \"redhat-marketplace-gwpmg\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") " pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.852985 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-catalog-content\") pod \"redhat-marketplace-gwpmg\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") " pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.853222 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-utilities\") pod \"redhat-marketplace-gwpmg\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") " pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.876127 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pww9d\" (UniqueName: \"kubernetes.io/projected/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-kube-api-access-pww9d\") pod \"redhat-marketplace-gwpmg\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") " pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:34 crc kubenswrapper[4965]: I0318 11:58:34.943895 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq"] Mar 18 11:58:34 crc kubenswrapper[4965]: W0318 11:58:34.958704 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3962338_1df0_4e65_a1cf_d1860a48a18c.slice/crio-0ea105214428f812f830fc40b869c1f8f6ba81d3d9bc75902374afb7f29bb1cb WatchSource:0}: Error finding container 0ea105214428f812f830fc40b869c1f8f6ba81d3d9bc75902374afb7f29bb1cb: Status 404 returned error can't find the container with id 0ea105214428f812f830fc40b869c1f8f6ba81d3d9bc75902374afb7f29bb1cb Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.008918 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.021752 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.036409 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rkz24"] Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.037386 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.066020 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkz24"] Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.155628 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-utilities\") pod \"redhat-marketplace-rkz24\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.155730 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-catalog-content\") pod \"redhat-marketplace-rkz24\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.155771 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvjm4\" (UniqueName: \"kubernetes.io/projected/7a889591-6c1d-4940-9eb2-28cfa6988f89-kube-api-access-pvjm4\") pod \"redhat-marketplace-rkz24\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " 
pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.256772 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-utilities\") pod \"redhat-marketplace-rkz24\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.256832 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-catalog-content\") pod \"redhat-marketplace-rkz24\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.256874 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvjm4\" (UniqueName: \"kubernetes.io/projected/7a889591-6c1d-4940-9eb2-28cfa6988f89-kube-api-access-pvjm4\") pod \"redhat-marketplace-rkz24\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.257686 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-utilities\") pod \"redhat-marketplace-rkz24\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.257740 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-catalog-content\") pod \"redhat-marketplace-rkz24\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " pod="openshift-marketplace/redhat-marketplace-rkz24" 
Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.301967 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvjm4\" (UniqueName: \"kubernetes.io/projected/7a889591-6c1d-4940-9eb2-28cfa6988f89-kube-api-access-pvjm4\") pod \"redhat-marketplace-rkz24\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.394828 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.476891 4965 generic.go:334] "Generic (PLEG): container finished" podID="2aaf7d7a-c623-4198-b68e-7efa895cb96f" containerID="ca308b01598a8ea855134f9818c3d3e759c37ff50919fd540d1d6e550e2f213c" exitCode=0 Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.476983 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" event={"ID":"2aaf7d7a-c623-4198-b68e-7efa895cb96f","Type":"ContainerDied","Data":"ca308b01598a8ea855134f9818c3d3e759c37ff50919fd540d1d6e550e2f213c"} Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.482200 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwpmg"] Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.484695 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" event={"ID":"d3962338-1df0-4e65-a1cf-d1860a48a18c","Type":"ContainerStarted","Data":"138559671c5a520589e780f12dae9c01c06c8b519221970cf7ac3c7e4e6df452"} Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.484746 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" 
event={"ID":"d3962338-1df0-4e65-a1cf-d1860a48a18c","Type":"ContainerStarted","Data":"0ea105214428f812f830fc40b869c1f8f6ba81d3d9bc75902374afb7f29bb1cb"} Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.485545 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.502209 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.517446 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qdbck" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.528302 4965 generic.go:334] "Generic (PLEG): container finished" podID="04982060-6c74-4de3-a270-3b3b59bb00e7" containerID="1a6ee453571168a07452ceee3f91f1155d0628a22a68a49e4304d58aecd1bf3f" exitCode=0 Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.528729 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"04982060-6c74-4de3-a270-3b3b59bb00e7","Type":"ContainerDied","Data":"1a6ee453571168a07452ceee3f91f1155d0628a22a68a49e4304d58aecd1bf3f"} Mar 18 11:58:35 crc kubenswrapper[4965]: W0318 11:58:35.529862 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf76b91e1_f768_4cb1_857d_6e3eb31e59f6.slice/crio-f5240c39614a6d1944304139b359ff5c416d3939337b76bbf246b2e767b8d847 WatchSource:0}: Error finding container f5240c39614a6d1944304139b359ff5c416d3939337b76bbf246b2e767b8d847: Status 404 returned error can't find the container with id f5240c39614a6d1944304139b359ff5c416d3939337b76bbf246b2e767b8d847 Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.584741 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.584791 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.589392 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" podStartSLOduration=6.58937327 podStartE2EDuration="6.58937327s" podCreationTimestamp="2026-03-18 11:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:35.576715971 +0000 UTC m=+120.562903470" watchObservedRunningTime="2026-03-18 11:58:35.58937327 +0000 UTC m=+120.575560739" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.604574 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.662677 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zx7g7"] Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.664042 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.669909 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.677526 4965 ???:1] "http: TLS handshake error from 192.168.126.11:42548: no serving certificate available for the kubelet" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.680797 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zx7g7"] Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.703549 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.719749 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.728437 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.728891 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.729168 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.729225 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.734951 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:35 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:35 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:35 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.735021 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.736741 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.782938 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxtzh\" (UniqueName: \"kubernetes.io/projected/8edec177-0701-4d1c-bb61-33c0a05df51d-kube-api-access-bxtzh\") pod \"redhat-operators-zx7g7\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") " pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.783070 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-catalog-content\") pod \"redhat-operators-zx7g7\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") " pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.783139 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-utilities\") pod \"redhat-operators-zx7g7\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") " 
pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.885475 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.885984 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxtzh\" (UniqueName: \"kubernetes.io/projected/8edec177-0701-4d1c-bb61-33c0a05df51d-kube-api-access-bxtzh\") pod \"redhat-operators-zx7g7\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") " pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.886080 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-catalog-content\") pod \"redhat-operators-zx7g7\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") " pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.886149 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.886173 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-utilities\") pod \"redhat-operators-zx7g7\" (UID: 
\"8edec177-0701-4d1c-bb61-33c0a05df51d\") " pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.888370 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-utilities\") pod \"redhat-operators-zx7g7\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") " pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.888519 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-catalog-content\") pod \"redhat-operators-zx7g7\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") " pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.928801 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxtzh\" (UniqueName: \"kubernetes.io/projected/8edec177-0701-4d1c-bb61-33c0a05df51d-kube-api-access-bxtzh\") pod \"redhat-operators-zx7g7\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") " pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.989778 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.989842 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:35 crc kubenswrapper[4965]: I0318 11:58:35.990183 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.009770 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.048833 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bq6m2"] Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.069669 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bq6m2"] Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.054010 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.058279 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.049747 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.071156 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.097321 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.103111 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.103934 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-q9zxt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.103962 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-q9zxt" podUID="18b96e51-1a8b-4947-a9a3-9af0c0e88100" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.104238 4965 patch_prober.go:28] interesting pod/downloads-7954f5f757-q9zxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.104311 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-q9zxt" podUID="18b96e51-1a8b-4947-a9a3-9af0c0e88100" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 18 11:58:36 crc kubenswrapper[4965]: E0318 11:58:36.172532 4965 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 11:58:36 crc kubenswrapper[4965]: E0318 11:58:36.175718 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 11:58:36 crc kubenswrapper[4965]: E0318 11:58:36.180458 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 11:58:36 crc kubenswrapper[4965]: E0318 11:58:36.180543 4965 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" podUID="ccc8d88d-863a-43b2-916d-b9df5a38453d" containerName="kube-multus-additional-cni-plugins" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.192738 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-catalog-content\") pod \"redhat-operators-bq6m2\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.192821 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hx2l2\" (UniqueName: \"kubernetes.io/projected/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-kube-api-access-hx2l2\") pod \"redhat-operators-bq6m2\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.192920 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-utilities\") pod \"redhat-operators-bq6m2\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.293734 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx2l2\" (UniqueName: \"kubernetes.io/projected/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-kube-api-access-hx2l2\") pod \"redhat-operators-bq6m2\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.294242 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-utilities\") pod \"redhat-operators-bq6m2\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.294273 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-catalog-content\") pod \"redhat-operators-bq6m2\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.295004 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-catalog-content\") pod \"redhat-operators-bq6m2\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.295382 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkz24"] Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.295547 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-utilities\") pod \"redhat-operators-bq6m2\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.313963 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx2l2\" (UniqueName: \"kubernetes.io/projected/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-kube-api-access-hx2l2\") pod \"redhat-operators-bq6m2\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.344556 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.391230 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zx7g7"] Mar 18 11:58:36 crc kubenswrapper[4965]: W0318 11:58:36.424096 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8edec177_0701_4d1c_bb61_33c0a05df51d.slice/crio-d57b2fc4a17c699444ebe3febc74bacda7beea71728002c70a90d7be5c26c16b WatchSource:0}: Error finding container d57b2fc4a17c699444ebe3febc74bacda7beea71728002c70a90d7be5c26c16b: Status 404 returned error 
can't find the container with id d57b2fc4a17c699444ebe3febc74bacda7beea71728002c70a90d7be5c26c16b Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.441258 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.475195 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 11:58:36 crc kubenswrapper[4965]: W0318 11:58:36.530992 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode50e5ba5_1542_4bdb_9aaa_adff0388ce10.slice/crio-3a580cd92f6a794faec6f702e8c84dba25c15ccae7d2e728730a2b1dcaf3ccc9 WatchSource:0}: Error finding container 3a580cd92f6a794faec6f702e8c84dba25c15ccae7d2e728730a2b1dcaf3ccc9: Status 404 returned error can't find the container with id 3a580cd92f6a794faec6f702e8c84dba25c15ccae7d2e728730a2b1dcaf3ccc9 Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.541333 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkz24" event={"ID":"7a889591-6c1d-4940-9eb2-28cfa6988f89","Type":"ContainerStarted","Data":"832507c8698dd2e5cfe95eab0c602496a320ad1df64238be63be6e136360019b"} Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.544112 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7g7" event={"ID":"8edec177-0701-4d1c-bb61-33c0a05df51d","Type":"ContainerStarted","Data":"d57b2fc4a17c699444ebe3febc74bacda7beea71728002c70a90d7be5c26c16b"} Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.549069 4965 generic.go:334] "Generic (PLEG): container finished" podID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerID="0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb" exitCode=0 Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.549250 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-gwpmg" event={"ID":"f76b91e1-f768-4cb1-857d-6e3eb31e59f6","Type":"ContainerDied","Data":"0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb"} Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.549452 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwpmg" event={"ID":"f76b91e1-f768-4cb1-857d-6e3eb31e59f6","Type":"ContainerStarted","Data":"f5240c39614a6d1944304139b359ff5c416d3939337b76bbf246b2e767b8d847"} Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.554358 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mmzsn" Mar 18 11:58:36 crc kubenswrapper[4965]: E0318 11:58:36.673884 4965 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a889591_6c1d_4940_9eb2_28cfa6988f89.slice/crio-conmon-f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b.scope\": RecentStats: unable to find data in memory cache]" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.731581 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:36 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:36 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:36 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.731691 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:36 crc 
kubenswrapper[4965]: I0318 11:58:36.959679 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" Mar 18 11:58:36 crc kubenswrapper[4965]: I0318 11:58:36.982813 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.123196 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aaf7d7a-c623-4198-b68e-7efa895cb96f-secret-volume\") pod \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.123299 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04982060-6c74-4de3-a270-3b3b59bb00e7-kube-api-access\") pod \"04982060-6c74-4de3-a270-3b3b59bb00e7\" (UID: \"04982060-6c74-4de3-a270-3b3b59bb00e7\") " Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.123322 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04982060-6c74-4de3-a270-3b3b59bb00e7-kubelet-dir\") pod \"04982060-6c74-4de3-a270-3b3b59bb00e7\" (UID: \"04982060-6c74-4de3-a270-3b3b59bb00e7\") " Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.123375 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aaf7d7a-c623-4198-b68e-7efa895cb96f-config-volume\") pod \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.123418 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5cjt\" (UniqueName: 
\"kubernetes.io/projected/2aaf7d7a-c623-4198-b68e-7efa895cb96f-kube-api-access-b5cjt\") pod \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\" (UID: \"2aaf7d7a-c623-4198-b68e-7efa895cb96f\") " Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.128399 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04982060-6c74-4de3-a270-3b3b59bb00e7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "04982060-6c74-4de3-a270-3b3b59bb00e7" (UID: "04982060-6c74-4de3-a270-3b3b59bb00e7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.128977 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aaf7d7a-c623-4198-b68e-7efa895cb96f-config-volume" (OuterVolumeSpecName: "config-volume") pod "2aaf7d7a-c623-4198-b68e-7efa895cb96f" (UID: "2aaf7d7a-c623-4198-b68e-7efa895cb96f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.132878 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04982060-6c74-4de3-a270-3b3b59bb00e7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "04982060-6c74-4de3-a270-3b3b59bb00e7" (UID: "04982060-6c74-4de3-a270-3b3b59bb00e7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.134126 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aaf7d7a-c623-4198-b68e-7efa895cb96f-kube-api-access-b5cjt" (OuterVolumeSpecName: "kube-api-access-b5cjt") pod "2aaf7d7a-c623-4198-b68e-7efa895cb96f" (UID: "2aaf7d7a-c623-4198-b68e-7efa895cb96f"). InnerVolumeSpecName "kube-api-access-b5cjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.134538 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aaf7d7a-c623-4198-b68e-7efa895cb96f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2aaf7d7a-c623-4198-b68e-7efa895cb96f" (UID: "2aaf7d7a-c623-4198-b68e-7efa895cb96f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.152951 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bq6m2"] Mar 18 11:58:37 crc kubenswrapper[4965]: W0318 11:58:37.181304 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c7d53d7_b4fc_48d5_b8f4_545ea972697e.slice/crio-5aff5bb0025284ff396efdc2e6d96cddd21b628119bdc55a0148aab0566e4147 WatchSource:0}: Error finding container 5aff5bb0025284ff396efdc2e6d96cddd21b628119bdc55a0148aab0566e4147: Status 404 returned error can't find the container with id 5aff5bb0025284ff396efdc2e6d96cddd21b628119bdc55a0148aab0566e4147 Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.225770 4965 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aaf7d7a-c623-4198-b68e-7efa895cb96f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.226194 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5cjt\" (UniqueName: \"kubernetes.io/projected/2aaf7d7a-c623-4198-b68e-7efa895cb96f-kube-api-access-b5cjt\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.226210 4965 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aaf7d7a-c623-4198-b68e-7efa895cb96f-secret-volume\") on node \"crc\" 
DevicePath \"\"" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.226222 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04982060-6c74-4de3-a270-3b3b59bb00e7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.226252 4965 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04982060-6c74-4de3-a270-3b3b59bb00e7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.574083 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e50e5ba5-1542-4bdb-9aaa-adff0388ce10","Type":"ContainerStarted","Data":"08be702a56dd2a4fe0882e6e7d8bfa5bea876a52d586a5a70b62752781b3f9a7"} Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.574126 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e50e5ba5-1542-4bdb-9aaa-adff0388ce10","Type":"ContainerStarted","Data":"3a580cd92f6a794faec6f702e8c84dba25c15ccae7d2e728730a2b1dcaf3ccc9"} Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.579077 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"04982060-6c74-4de3-a270-3b3b59bb00e7","Type":"ContainerDied","Data":"f6395e84cfbab9983b7b496eb53d7e17ab96b58a7d1c6f2823c1f91cc138b3de"} Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.579148 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6395e84cfbab9983b7b496eb53d7e17ab96b58a7d1c6f2823c1f91cc138b3de" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.579115 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.581792 4965 generic.go:334] "Generic (PLEG): container finished" podID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerID="f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b" exitCode=0 Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.581850 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkz24" event={"ID":"7a889591-6c1d-4940-9eb2-28cfa6988f89","Type":"ContainerDied","Data":"f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b"} Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.584103 4965 generic.go:334] "Generic (PLEG): container finished" podID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerID="e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b" exitCode=0 Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.584174 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7g7" event={"ID":"8edec177-0701-4d1c-bb61-33c0a05df51d","Type":"ContainerDied","Data":"e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b"} Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.590076 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" event={"ID":"2aaf7d7a-c623-4198-b68e-7efa895cb96f","Type":"ContainerDied","Data":"efc9f89a90861ced30a0f130b5fd9c8b0d25603a681a3a7a811e0b54f1245446"} Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.590106 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc9f89a90861ced30a0f130b5fd9c8b0d25603a681a3a7a811e0b54f1245446" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.590151 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563905-l6rnm" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.595829 4965 generic.go:334] "Generic (PLEG): container finished" podID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerID="b5b6aebee14d591b0ff69b46c82bba28db042ce8a323508fa411429f5bc2ed92" exitCode=0 Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.595883 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq6m2" event={"ID":"6c7d53d7-b4fc-48d5-b8f4-545ea972697e","Type":"ContainerDied","Data":"b5b6aebee14d591b0ff69b46c82bba28db042ce8a323508fa411429f5bc2ed92"} Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.595925 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq6m2" event={"ID":"6c7d53d7-b4fc-48d5-b8f4-545ea972697e","Type":"ContainerStarted","Data":"5aff5bb0025284ff396efdc2e6d96cddd21b628119bdc55a0148aab0566e4147"} Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.596777 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.5967635700000002 podStartE2EDuration="2.59676357s" podCreationTimestamp="2026-03-18 11:58:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:37.590631651 +0000 UTC m=+122.576819130" watchObservedRunningTime="2026-03-18 11:58:37.59676357 +0000 UTC m=+122.582951049" Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.730154 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:37 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:37 crc 
kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:37 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:37 crc kubenswrapper[4965]: I0318 11:58:37.730217 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.034421 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.621746 4965 generic.go:334] "Generic (PLEG): container finished" podID="e50e5ba5-1542-4bdb-9aaa-adff0388ce10" containerID="08be702a56dd2a4fe0882e6e7d8bfa5bea876a52d586a5a70b62752781b3f9a7" exitCode=0 Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.621846 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e50e5ba5-1542-4bdb-9aaa-adff0388ce10","Type":"ContainerDied","Data":"08be702a56dd2a4fe0882e6e7d8bfa5bea876a52d586a5a70b62752781b3f9a7"} Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.662297 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.662274798 podStartE2EDuration="662.274798ms" podCreationTimestamp="2026-03-18 11:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:38.658318039 +0000 UTC m=+123.644505518" watchObservedRunningTime="2026-03-18 11:58:38.662274798 +0000 UTC m=+123.648462277" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.725208 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:38 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:38 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:38 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.725279 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.853841 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.853937 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.853974 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 
11:58:38.854030 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.855924 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.856129 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.856641 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.866493 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.866784 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.871808 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.878934 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.880965 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.948720 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.956501 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 11:58:38 crc kubenswrapper[4965]: I0318 11:58:38.971958 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:39 crc kubenswrapper[4965]: I0318 11:58:39.726798 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:39 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:39 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:39 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:39 crc kubenswrapper[4965]: I0318 11:58:39.727147 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:39 crc kubenswrapper[4965]: W0318 11:58:39.804758 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-1099cec55875e96eefa6517c85dfb6147888c0fe07292ccb863cab992e3fe078 WatchSource:0}: Error finding container 1099cec55875e96eefa6517c85dfb6147888c0fe07292ccb863cab992e3fe078: Status 404 returned error can't find the container with id 1099cec55875e96eefa6517c85dfb6147888c0fe07292ccb863cab992e3fe078 Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.108509 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.273275 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kube-api-access\") pod \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\" (UID: \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\") " Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.273643 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kubelet-dir\") pod \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\" (UID: \"e50e5ba5-1542-4bdb-9aaa-adff0388ce10\") " Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.273782 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e50e5ba5-1542-4bdb-9aaa-adff0388ce10" (UID: "e50e5ba5-1542-4bdb-9aaa-adff0388ce10"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.274897 4965 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.279928 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e50e5ba5-1542-4bdb-9aaa-adff0388ce10" (UID: "e50e5ba5-1542-4bdb-9aaa-adff0388ce10"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.375820 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e50e5ba5-1542-4bdb-9aaa-adff0388ce10-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.663840 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b68547d78ee00db7569c9c2a7e97dff4cd86daaa1266760fb433b14fe70baf78"} Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.663881 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1099cec55875e96eefa6517c85dfb6147888c0fe07292ccb863cab992e3fe078"} Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.667664 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e50e5ba5-1542-4bdb-9aaa-adff0388ce10","Type":"ContainerDied","Data":"3a580cd92f6a794faec6f702e8c84dba25c15ccae7d2e728730a2b1dcaf3ccc9"} Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.667707 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a580cd92f6a794faec6f702e8c84dba25c15ccae7d2e728730a2b1dcaf3ccc9" Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.667799 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.691202 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3f9d337210738d6e2a57db34c92d57c85aa81f8fbfdd5cc9aca3c37a086113e2"} Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.691252 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"713cea380fe21355b732a4a371146bb7f0e94fd94f1be06faf16b25ee887d38d"} Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.698590 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8f660e19f79859d4c785089a64af49c7a9ea7e8ba4c0bd4cf7a48103d83daf7a"} Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.698652 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b90e4046e44fa5120a0e7655c9e7eb4bd4b985abff582cef8983db6b25ed13de"} Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.698960 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.731148 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:40 crc kubenswrapper[4965]: 
[-]has-synced failed: reason withheld Mar 18 11:58:40 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:40 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.731201 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:40 crc kubenswrapper[4965]: I0318 11:58:40.830064 4965 ???:1] "http: TLS handshake error from 192.168.126.11:47662: no serving certificate available for the kubelet" Mar 18 11:58:41 crc kubenswrapper[4965]: I0318 11:58:41.472334 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w9kwc" Mar 18 11:58:41 crc kubenswrapper[4965]: I0318 11:58:41.725315 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:41 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:41 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:41 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:41 crc kubenswrapper[4965]: I0318 11:58:41.725375 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:41 crc kubenswrapper[4965]: I0318 11:58:41.840021 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ts942" Mar 18 11:58:42 crc kubenswrapper[4965]: I0318 11:58:42.510483 4965 ???:1] "http: TLS handshake error from 192.168.126.11:47672: no serving certificate 
available for the kubelet" Mar 18 11:58:42 crc kubenswrapper[4965]: I0318 11:58:42.725406 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:42 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:42 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:42 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:42 crc kubenswrapper[4965]: I0318 11:58:42.725459 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:43 crc kubenswrapper[4965]: I0318 11:58:43.725113 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:43 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:43 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:43 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:43 crc kubenswrapper[4965]: I0318 11:58:43.725167 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:44 crc kubenswrapper[4965]: I0318 11:58:44.731988 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld 
Mar 18 11:58:44 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:44 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:44 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:44 crc kubenswrapper[4965]: I0318 11:58:44.732059 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 11:58:44 crc kubenswrapper[4965]: I0318 11:58:44.798760 4965 patch_prober.go:28] interesting pod/console-f9d7485db-pj8km container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 18 11:58:44 crc kubenswrapper[4965]: I0318 11:58:44.798820 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pj8km" podUID="bf0cad60-78bb-4325-aaaa-ee2636410fcb" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 18 11:58:45 crc kubenswrapper[4965]: I0318 11:58:45.726296 4965 patch_prober.go:28] interesting pod/router-default-5444994796-xhsv5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 11:58:45 crc kubenswrapper[4965]: [-]has-synced failed: reason withheld Mar 18 11:58:45 crc kubenswrapper[4965]: [+]process-running ok Mar 18 11:58:45 crc kubenswrapper[4965]: healthz check failed Mar 18 11:58:45 crc kubenswrapper[4965]: I0318 11:58:45.726343 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhsv5" podUID="ea2b4f76-5f61-410a-bb32-c1185e630c7a" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 18 11:58:46 crc kubenswrapper[4965]: I0318 11:58:46.109889 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-q9zxt" Mar 18 11:58:46 crc kubenswrapper[4965]: E0318 11:58:46.157581 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 11:58:46 crc kubenswrapper[4965]: E0318 11:58:46.159976 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 11:58:46 crc kubenswrapper[4965]: E0318 11:58:46.162221 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 11:58:46 crc kubenswrapper[4965]: E0318 11:58:46.162257 4965 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" podUID="ccc8d88d-863a-43b2-916d-b9df5a38453d" containerName="kube-multus-additional-cni-plugins" Mar 18 11:58:46 crc kubenswrapper[4965]: I0318 11:58:46.725344 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:46 
crc kubenswrapper[4965]: I0318 11:58:46.727616 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xhsv5" Mar 18 11:58:48 crc kubenswrapper[4965]: I0318 11:58:48.144876 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-768f74964f-q9448"] Mar 18 11:58:48 crc kubenswrapper[4965]: I0318 11:58:48.145420 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" podUID="41c127cf-02c5-45cd-bf65-5015dd07bd7f" containerName="controller-manager" containerID="cri-o://234b60652eb06cf5c71cd97027084cab71bc15881eb956b6ffa36efe073190b3" gracePeriod=30 Mar 18 11:58:48 crc kubenswrapper[4965]: I0318 11:58:48.162888 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq"] Mar 18 11:58:48 crc kubenswrapper[4965]: I0318 11:58:48.163129 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" podUID="d3962338-1df0-4e65-a1cf-d1860a48a18c" containerName="route-controller-manager" containerID="cri-o://138559671c5a520589e780f12dae9c01c06c8b519221970cf7ac3c7e4e6df452" gracePeriod=30 Mar 18 11:58:48 crc kubenswrapper[4965]: I0318 11:58:48.759331 4965 generic.go:334] "Generic (PLEG): container finished" podID="d3962338-1df0-4e65-a1cf-d1860a48a18c" containerID="138559671c5a520589e780f12dae9c01c06c8b519221970cf7ac3c7e4e6df452" exitCode=0 Mar 18 11:58:48 crc kubenswrapper[4965]: I0318 11:58:48.759414 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" event={"ID":"d3962338-1df0-4e65-a1cf-d1860a48a18c","Type":"ContainerDied","Data":"138559671c5a520589e780f12dae9c01c06c8b519221970cf7ac3c7e4e6df452"} Mar 18 11:58:48 crc 
kubenswrapper[4965]: I0318 11:58:48.761116 4965 generic.go:334] "Generic (PLEG): container finished" podID="41c127cf-02c5-45cd-bf65-5015dd07bd7f" containerID="234b60652eb06cf5c71cd97027084cab71bc15881eb956b6ffa36efe073190b3" exitCode=0 Mar 18 11:58:48 crc kubenswrapper[4965]: I0318 11:58:48.761140 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" event={"ID":"41c127cf-02c5-45cd-bf65-5015dd07bd7f","Type":"ContainerDied","Data":"234b60652eb06cf5c71cd97027084cab71bc15881eb956b6ffa36efe073190b3"} Mar 18 11:58:51 crc kubenswrapper[4965]: I0318 11:58:51.096074 4965 ???:1] "http: TLS handshake error from 192.168.126.11:54426: no serving certificate available for the kubelet" Mar 18 11:58:52 crc kubenswrapper[4965]: I0318 11:58:52.041040 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 11:58:52 crc kubenswrapper[4965]: I0318 11:58:52.612497 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" Mar 18 11:58:52 crc kubenswrapper[4965]: I0318 11:58:52.626042 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.626028095 podStartE2EDuration="626.028095ms" podCreationTimestamp="2026-03-18 11:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:52.623022838 +0000 UTC m=+137.609210317" watchObservedRunningTime="2026-03-18 11:58:52.626028095 +0000 UTC m=+137.612215574" Mar 18 11:58:53 crc kubenswrapper[4965]: I0318 11:58:53.035730 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 11:58:53 crc kubenswrapper[4965]: I0318 11:58:53.192366 4965 patch_prober.go:28] interesting 
pod/controller-manager-768f74964f-q9448 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 11:58:53 crc kubenswrapper[4965]: I0318 11:58:53.192771 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" podUID="41c127cf-02c5-45cd-bf65-5015dd07bd7f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 11:58:54 crc kubenswrapper[4965]: I0318 11:58:54.888703 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:54 crc kubenswrapper[4965]: I0318 11:58:54.898374 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-pj8km" Mar 18 11:58:54 crc kubenswrapper[4965]: I0318 11:58:54.923476 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.92345822 podStartE2EDuration="1.92345822s" podCreationTimestamp="2026-03-18 11:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:58:54.921315128 +0000 UTC m=+139.907502607" watchObservedRunningTime="2026-03-18 11:58:54.92345822 +0000 UTC m=+139.909645699" Mar 18 11:58:55 crc kubenswrapper[4965]: I0318 11:58:55.653569 4965 patch_prober.go:28] interesting pod/route-controller-manager-774bccfc87-tv2nq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 11:58:55 crc kubenswrapper[4965]: I0318 11:58:55.653642 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" podUID="d3962338-1df0-4e65-a1cf-d1860a48a18c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 11:58:56 crc kubenswrapper[4965]: E0318 11:58:56.156442 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 11:58:56 crc kubenswrapper[4965]: E0318 11:58:56.158059 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 11:58:56 crc kubenswrapper[4965]: E0318 11:58:56.159592 4965 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 11:58:56 crc kubenswrapper[4965]: E0318 11:58:56.159629 4965 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" podUID="ccc8d88d-863a-43b2-916d-b9df5a38453d" containerName="kube-multus-additional-cni-plugins" Mar 18 11:58:58 crc kubenswrapper[4965]: E0318 11:58:58.228853 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 11:58:58 crc kubenswrapper[4965]: E0318 11:58:58.229203 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h6wdl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSou
rce{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9btzt_openshift-marketplace(11860b27-ab66-423c-9939-df595c023a38): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 11:58:58 crc kubenswrapper[4965]: E0318 11:58:58.230471 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9btzt" podUID="11860b27-ab66-423c-9939-df595c023a38" Mar 18 11:59:01 crc kubenswrapper[4965]: E0318 11:59:01.526768 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9btzt" podUID="11860b27-ab66-423c-9939-df595c023a38" Mar 18 11:59:01 crc kubenswrapper[4965]: E0318 11:59:01.602374 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 11:59:01 crc kubenswrapper[4965]: E0318 11:59:01.602995 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxtzh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zx7g7_openshift-marketplace(8edec177-0701-4d1c-bb61-33c0a05df51d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 11:59:01 crc kubenswrapper[4965]: E0318 11:59:01.604171 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zx7g7" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" Mar 18 11:59:01 crc 
kubenswrapper[4965]: I0318 11:59:01.832669 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-5scj2_ccc8d88d-863a-43b2-916d-b9df5a38453d/kube-multus-additional-cni-plugins/0.log" Mar 18 11:59:01 crc kubenswrapper[4965]: I0318 11:59:01.832712 4965 generic.go:334] "Generic (PLEG): container finished" podID="ccc8d88d-863a-43b2-916d-b9df5a38453d" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad" exitCode=137 Mar 18 11:59:01 crc kubenswrapper[4965]: I0318 11:59:01.832795 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" event={"ID":"ccc8d88d-863a-43b2-916d-b9df5a38453d","Type":"ContainerDied","Data":"3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad"} Mar 18 11:59:02 crc kubenswrapper[4965]: E0318 11:59:02.940986 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zx7g7" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" Mar 18 11:59:02 crc kubenswrapper[4965]: I0318 11:59:02.990822 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" Mar 18 11:59:02 crc kubenswrapper[4965]: I0318 11:59:02.994649 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" Mar 18 11:59:02 crc kubenswrapper[4965]: I0318 11:59:02.999740 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-5scj2_ccc8d88d-863a-43b2-916d-b9df5a38453d/kube-multus-additional-cni-plugins/0.log" Mar 18 11:59:02 crc kubenswrapper[4965]: I0318 11:59:02.999803 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.026025 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"] Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.029001 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50e5ba5-1542-4bdb-9aaa-adff0388ce10" containerName="pruner" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029023 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50e5ba5-1542-4bdb-9aaa-adff0388ce10" containerName="pruner" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.029067 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc8d88d-863a-43b2-916d-b9df5a38453d" containerName="kube-multus-additional-cni-plugins" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029077 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc8d88d-863a-43b2-916d-b9df5a38453d" containerName="kube-multus-additional-cni-plugins" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.029091 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aaf7d7a-c623-4198-b68e-7efa895cb96f" containerName="collect-profiles" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029101 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aaf7d7a-c623-4198-b68e-7efa895cb96f" containerName="collect-profiles" Mar 18 11:59:03 crc kubenswrapper[4965]: 
E0318 11:59:03.029142 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04982060-6c74-4de3-a270-3b3b59bb00e7" containerName="pruner" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029152 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="04982060-6c74-4de3-a270-3b3b59bb00e7" containerName="pruner" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.029164 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c127cf-02c5-45cd-bf65-5015dd07bd7f" containerName="controller-manager" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029174 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c127cf-02c5-45cd-bf65-5015dd07bd7f" containerName="controller-manager" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.029184 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3962338-1df0-4e65-a1cf-d1860a48a18c" containerName="route-controller-manager" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029194 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3962338-1df0-4e65-a1cf-d1860a48a18c" containerName="route-controller-manager" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029374 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aaf7d7a-c623-4198-b68e-7efa895cb96f" containerName="collect-profiles" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029407 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc8d88d-863a-43b2-916d-b9df5a38453d" containerName="kube-multus-additional-cni-plugins" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029420 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="04982060-6c74-4de3-a270-3b3b59bb00e7" containerName="pruner" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029460 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c127cf-02c5-45cd-bf65-5015dd07bd7f" containerName="controller-manager" Mar 18 11:59:03 
crc kubenswrapper[4965]: I0318 11:59:03.029474 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3962338-1df0-4e65-a1cf-d1860a48a18c" containerName="route-controller-manager" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.029484 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="e50e5ba5-1542-4bdb-9aaa-adff0388ce10" containerName="pruner" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.031852 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.032023 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.032287 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfc57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6pk8b_openshift-marketplace(4329347a-487b-4012-a715-9565bc4e67d0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.033942 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6pk8b" podUID="4329347a-487b-4012-a715-9565bc4e67d0" Mar 18 11:59:03 crc 
kubenswrapper[4965]: I0318 11:59:03.048194 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"] Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.048563 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.048690 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pww9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Terminat
ionMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gwpmg_openshift-marketplace(f76b91e1-f768-4cb1-857d-6e3eb31e59f6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.051245 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gwpmg" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.066945 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.067069 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pvjm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rkz24_openshift-marketplace(7a889591-6c1d-4940-9eb2-28cfa6988f89): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.068195 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rkz24" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" Mar 18 11:59:03 crc 
kubenswrapper[4965]: E0318 11:59:03.114844 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.115382 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hx2l2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-bq6m2_openshift-marketplace(6c7d53d7-b4fc-48d5-b8f4-545ea972697e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.116792 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bq6m2" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.133057 4965 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.133212 4965 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-895zv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5wjlx_openshift-marketplace(39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.134415 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5wjlx" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" Mar 18 11:59:03 crc 
kubenswrapper[4965]: I0318 11:59:03.150540 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3962338-1df0-4e65-a1cf-d1860a48a18c-serving-cert\") pod \"d3962338-1df0-4e65-a1cf-d1860a48a18c\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.150911 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ccc8d88d-863a-43b2-916d-b9df5a38453d-ready\") pod \"ccc8d88d-863a-43b2-916d-b9df5a38453d\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.150943 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-config\") pod \"d3962338-1df0-4e65-a1cf-d1860a48a18c\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.150972 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-client-ca\") pod \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151025 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k66lh\" (UniqueName: \"kubernetes.io/projected/d3962338-1df0-4e65-a1cf-d1860a48a18c-kube-api-access-k66lh\") pod \"d3962338-1df0-4e65-a1cf-d1860a48a18c\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151060 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vww9\" (UniqueName: \"kubernetes.io/projected/41c127cf-02c5-45cd-bf65-5015dd07bd7f-kube-api-access-5vww9\") pod 
\"41c127cf-02c5-45cd-bf65-5015dd07bd7f\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151096 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c127cf-02c5-45cd-bf65-5015dd07bd7f-serving-cert\") pod \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151123 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccc8d88d-863a-43b2-916d-b9df5a38453d-tuning-conf-dir\") pod \"ccc8d88d-863a-43b2-916d-b9df5a38453d\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151153 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ccc8d88d-863a-43b2-916d-b9df5a38453d-cni-sysctl-allowlist\") pod \"ccc8d88d-863a-43b2-916d-b9df5a38453d\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151180 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-client-ca\") pod \"d3962338-1df0-4e65-a1cf-d1860a48a18c\" (UID: \"d3962338-1df0-4e65-a1cf-d1860a48a18c\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151232 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm28w\" (UniqueName: \"kubernetes.io/projected/ccc8d88d-863a-43b2-916d-b9df5a38453d-kube-api-access-cm28w\") pod \"ccc8d88d-863a-43b2-916d-b9df5a38453d\" (UID: \"ccc8d88d-863a-43b2-916d-b9df5a38453d\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151263 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-proxy-ca-bundles\") pod \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151293 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-config\") pod \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\" (UID: \"41c127cf-02c5-45cd-bf65-5015dd07bd7f\") " Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151495 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-client-ca\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151562 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-config\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151596 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqkgk\" (UniqueName: \"kubernetes.io/projected/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-kube-api-access-zqkgk\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.151624 
4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-serving-cert\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.152320 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccc8d88d-863a-43b2-916d-b9df5a38453d-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "ccc8d88d-863a-43b2-916d-b9df5a38453d" (UID: "ccc8d88d-863a-43b2-916d-b9df5a38453d"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.152368 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccc8d88d-863a-43b2-916d-b9df5a38453d-ready" (OuterVolumeSpecName: "ready") pod "ccc8d88d-863a-43b2-916d-b9df5a38453d" (UID: "ccc8d88d-863a-43b2-916d-b9df5a38453d"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.152565 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-client-ca" (OuterVolumeSpecName: "client-ca") pod "d3962338-1df0-4e65-a1cf-d1860a48a18c" (UID: "d3962338-1df0-4e65-a1cf-d1860a48a18c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.152837 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccc8d88d-863a-43b2-916d-b9df5a38453d-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "ccc8d88d-863a-43b2-916d-b9df5a38453d" (UID: "ccc8d88d-863a-43b2-916d-b9df5a38453d"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.152836 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-client-ca" (OuterVolumeSpecName: "client-ca") pod "41c127cf-02c5-45cd-bf65-5015dd07bd7f" (UID: "41c127cf-02c5-45cd-bf65-5015dd07bd7f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.153004 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-config" (OuterVolumeSpecName: "config") pod "d3962338-1df0-4e65-a1cf-d1860a48a18c" (UID: "d3962338-1df0-4e65-a1cf-d1860a48a18c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.153411 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-config" (OuterVolumeSpecName: "config") pod "41c127cf-02c5-45cd-bf65-5015dd07bd7f" (UID: "41c127cf-02c5-45cd-bf65-5015dd07bd7f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.155917 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3962338-1df0-4e65-a1cf-d1860a48a18c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d3962338-1df0-4e65-a1cf-d1860a48a18c" (UID: "d3962338-1df0-4e65-a1cf-d1860a48a18c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.155967 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c127cf-02c5-45cd-bf65-5015dd07bd7f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "41c127cf-02c5-45cd-bf65-5015dd07bd7f" (UID: "41c127cf-02c5-45cd-bf65-5015dd07bd7f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.156097 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c127cf-02c5-45cd-bf65-5015dd07bd7f-kube-api-access-5vww9" (OuterVolumeSpecName: "kube-api-access-5vww9") pod "41c127cf-02c5-45cd-bf65-5015dd07bd7f" (UID: "41c127cf-02c5-45cd-bf65-5015dd07bd7f"). InnerVolumeSpecName "kube-api-access-5vww9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.156279 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3962338-1df0-4e65-a1cf-d1860a48a18c-kube-api-access-k66lh" (OuterVolumeSpecName: "kube-api-access-k66lh") pod "d3962338-1df0-4e65-a1cf-d1860a48a18c" (UID: "d3962338-1df0-4e65-a1cf-d1860a48a18c"). InnerVolumeSpecName "kube-api-access-k66lh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.156910 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "41c127cf-02c5-45cd-bf65-5015dd07bd7f" (UID: "41c127cf-02c5-45cd-bf65-5015dd07bd7f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.158592 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc8d88d-863a-43b2-916d-b9df5a38453d-kube-api-access-cm28w" (OuterVolumeSpecName: "kube-api-access-cm28w") pod "ccc8d88d-863a-43b2-916d-b9df5a38453d" (UID: "ccc8d88d-863a-43b2-916d-b9df5a38453d"). InnerVolumeSpecName "kube-api-access-cm28w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.191970 4965 patch_prober.go:28] interesting pod/controller-manager-768f74964f-q9448 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.192108 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" podUID="41c127cf-02c5-45cd-bf65-5015dd07bd7f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252520 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqkgk\" 
(UniqueName: \"kubernetes.io/projected/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-kube-api-access-zqkgk\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252560 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-serving-cert\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252607 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-client-ca\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252647 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-config\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252695 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vww9\" (UniqueName: \"kubernetes.io/projected/41c127cf-02c5-45cd-bf65-5015dd07bd7f-kube-api-access-5vww9\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252706 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/41c127cf-02c5-45cd-bf65-5015dd07bd7f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252715 4965 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccc8d88d-863a-43b2-916d-b9df5a38453d-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252726 4965 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ccc8d88d-863a-43b2-916d-b9df5a38453d-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252733 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252741 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm28w\" (UniqueName: \"kubernetes.io/projected/ccc8d88d-863a-43b2-916d-b9df5a38453d-kube-api-access-cm28w\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252749 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252758 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252766 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3962338-1df0-4e65-a1cf-d1860a48a18c-serving-cert\") on node \"crc\" DevicePath \"\"" 
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252774 4965 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ccc8d88d-863a-43b2-916d-b9df5a38453d-ready\") on node \"crc\" DevicePath \"\""
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252781 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3962338-1df0-4e65-a1cf-d1860a48a18c-config\") on node \"crc\" DevicePath \"\""
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252788 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41c127cf-02c5-45cd-bf65-5015dd07bd7f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.252796 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k66lh\" (UniqueName: \"kubernetes.io/projected/d3962338-1df0-4e65-a1cf-d1860a48a18c-kube-api-access-k66lh\") on node \"crc\" DevicePath \"\""
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.253868 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-config\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.255787 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-client-ca\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.257861 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-serving-cert\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.273458 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqkgk\" (UniqueName: \"kubernetes.io/projected/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-kube-api-access-zqkgk\") pod \"route-controller-manager-7f6995df46-nmlhx\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") " pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.367367 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.553200 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"]
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.845607 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" event={"ID":"1748e8eb-85b8-4481-a1b4-3ce55f029c8d","Type":"ContainerStarted","Data":"4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c"}
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.845916 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.845933 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" event={"ID":"1748e8eb-85b8-4481-a1b4-3ce55f029c8d","Type":"ContainerStarted","Data":"a166456cb38bdd1f5cf16b84139ea655e97b9fd8bc500685d5d61bbc4ce69506"}
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.848560 4965 generic.go:334] "Generic (PLEG): container finished" podID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerID="c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860" exitCode=0
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.848628 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjkft" event={"ID":"c3614d69-f3b4-4496-af2f-d119c56de1c7","Type":"ContainerDied","Data":"c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860"}
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.851059 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-5scj2_ccc8d88d-863a-43b2-916d-b9df5a38453d/kube-multus-additional-cni-plugins/0.log"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.851194 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.851782 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5scj2" event={"ID":"ccc8d88d-863a-43b2-916d-b9df5a38453d","Type":"ContainerDied","Data":"b4c6d0c88c9db38dc49b013fde4d006b99a97e8b220d1823a382856641390973"}
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.851844 4965 scope.go:117] "RemoveContainer" containerID="3f72bea0b0230bd5d6e3e34312b81820002f622700a1b0b20f09e43e4a37e2ad"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.858527 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-768f74964f-q9448" event={"ID":"41c127cf-02c5-45cd-bf65-5015dd07bd7f","Type":"ContainerDied","Data":"2be71dd524371e104a3651febd59472497b678a257d576c45bdfb459f2752252"}
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.858630 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-768f74964f-q9448"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.866684 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" podStartSLOduration=15.866669233 podStartE2EDuration="15.866669233s" podCreationTimestamp="2026-03-18 11:58:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:03.865386536 +0000 UTC m=+148.851574015" watchObservedRunningTime="2026-03-18 11:59:03.866669233 +0000 UTC m=+148.852856712"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.866910 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.868986 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq" event={"ID":"d3962338-1df0-4e65-a1cf-d1860a48a18c","Type":"ContainerDied","Data":"0ea105214428f812f830fc40b869c1f8f6ba81d3d9bc75902374afb7f29bb1cb"}
Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.869129 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gwpmg" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.878789 4965 scope.go:117] "RemoveContainer" containerID="234b60652eb06cf5c71cd97027084cab71bc15881eb956b6ffa36efe073190b3"
Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.878862 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rkz24" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89"
Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.878791 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bq6m2" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e"
Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.878951 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5wjlx" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5"
Mar 18 11:59:03 crc kubenswrapper[4965]: E0318 11:59:03.880941 4965 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6pk8b" podUID="4329347a-487b-4012-a715-9565bc4e67d0"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.908627 4965 scope.go:117] "RemoveContainer" containerID="138559671c5a520589e780f12dae9c01c06c8b519221970cf7ac3c7e4e6df452"
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.913811 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-768f74964f-q9448"]
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.916507 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-768f74964f-q9448"]
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.965708 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5scj2"]
Mar 18 11:59:03 crc kubenswrapper[4965]: I0318 11:59:03.969370 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5scj2"]
Mar 18 11:59:04 crc kubenswrapper[4965]: I0318 11:59:04.026149 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c127cf-02c5-45cd-bf65-5015dd07bd7f" path="/var/lib/kubelet/pods/41c127cf-02c5-45cd-bf65-5015dd07bd7f/volumes"
Mar 18 11:59:04 crc kubenswrapper[4965]: I0318 11:59:04.026632 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc8d88d-863a-43b2-916d-b9df5a38453d" path="/var/lib/kubelet/pods/ccc8d88d-863a-43b2-916d-b9df5a38453d/volumes"
Mar 18 11:59:04 crc kubenswrapper[4965]: I0318 11:59:04.060998 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq"]
Mar 18 11:59:04 crc kubenswrapper[4965]: I0318 11:59:04.067277 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-774bccfc87-tv2nq"]
Mar 18 11:59:04 crc kubenswrapper[4965]: I0318 11:59:04.200324 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"
Mar 18 11:59:04 crc kubenswrapper[4965]: I0318 11:59:04.874758 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjkft" event={"ID":"c3614d69-f3b4-4496-af2f-d119c56de1c7","Type":"ContainerStarted","Data":"96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5"}
Mar 18 11:59:04 crc kubenswrapper[4965]: I0318 11:59:04.895998 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sjkft" podStartSLOduration=2.720962472 podStartE2EDuration="32.895978095s" podCreationTimestamp="2026-03-18 11:58:32 +0000 UTC" firstStartedPulling="2026-03-18 11:58:34.322769541 +0000 UTC m=+119.308957020" lastFinishedPulling="2026-03-18 11:59:04.497785164 +0000 UTC m=+149.483972643" observedRunningTime="2026-03-18 11:59:04.892151883 +0000 UTC m=+149.878339372" watchObservedRunningTime="2026-03-18 11:59:04.895978095 +0000 UTC m=+149.882165574"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.266129 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"]
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.266973 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.269434 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.269774 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.270119 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.271741 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.271783 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.271975 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.276374 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"]
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.278726 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.396586 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-client-ca\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.396701 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvhv7\" (UniqueName: \"kubernetes.io/projected/19d94bd2-7927-466e-a14a-b2f25275b653-kube-api-access-fvhv7\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.396756 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-proxy-ca-bundles\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.396791 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d94bd2-7927-466e-a14a-b2f25275b653-serving-cert\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.396811 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-config\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.498429 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d94bd2-7927-466e-a14a-b2f25275b653-serving-cert\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.498765 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-config\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.498828 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-client-ca\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.498856 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvhv7\" (UniqueName: \"kubernetes.io/projected/19d94bd2-7927-466e-a14a-b2f25275b653-kube-api-access-fvhv7\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.498888 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-proxy-ca-bundles\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.499997 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-config\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.500147 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-proxy-ca-bundles\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.501581 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-client-ca\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.509493 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d94bd2-7927-466e-a14a-b2f25275b653-serving-cert\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.529201 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvhv7\" (UniqueName: \"kubernetes.io/projected/19d94bd2-7927-466e-a14a-b2f25275b653-kube-api-access-fvhv7\") pod \"controller-manager-68689d9bbb-zzqp5\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.616982 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.828527 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"]
Mar 18 11:59:05 crc kubenswrapper[4965]: I0318 11:59:05.884495 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5" event={"ID":"19d94bd2-7927-466e-a14a-b2f25275b653","Type":"ContainerStarted","Data":"ab01532fac80efe9040829e2a53b3e7defa225a746d27fcf71dfe8cf382745ef"}
Mar 18 11:59:06 crc kubenswrapper[4965]: I0318 11:59:06.040984 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3962338-1df0-4e65-a1cf-d1860a48a18c" path="/var/lib/kubelet/pods/d3962338-1df0-4e65-a1cf-d1860a48a18c/volumes"
Mar 18 11:59:06 crc kubenswrapper[4965]: I0318 11:59:06.388419 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bksfk"
Mar 18 11:59:06 crc kubenswrapper[4965]: I0318 11:59:06.892867 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5" event={"ID":"19d94bd2-7927-466e-a14a-b2f25275b653","Type":"ContainerStarted","Data":"d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649"}
Mar 18 11:59:06 crc kubenswrapper[4965]: I0318 11:59:06.893430 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:06 crc kubenswrapper[4965]: I0318 11:59:06.897817 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"
Mar 18 11:59:06 crc kubenswrapper[4965]: I0318 11:59:06.915394 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5" podStartSLOduration=18.915358686 podStartE2EDuration="18.915358686s" podCreationTimestamp="2026-03-18 11:58:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:06.910991499 +0000 UTC m=+151.897178978" watchObservedRunningTime="2026-03-18 11:59:06.915358686 +0000 UTC m=+151.901546175"
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.175759 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"]
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.260923 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"]
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.261147 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" podUID="1748e8eb-85b8-4481-a1b4-3ce55f029c8d" containerName="route-controller-manager" containerID="cri-o://4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c" gracePeriod=30
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.627954 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.665326 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-client-ca\") pod \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") "
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.665464 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqkgk\" (UniqueName: \"kubernetes.io/projected/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-kube-api-access-zqkgk\") pod \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") "
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.665550 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-config\") pod \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") "
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.665583 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-serving-cert\") pod \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\" (UID: \"1748e8eb-85b8-4481-a1b4-3ce55f029c8d\") "
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.666409 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "1748e8eb-85b8-4481-a1b4-3ce55f029c8d" (UID: "1748e8eb-85b8-4481-a1b4-3ce55f029c8d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.667038 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-config" (OuterVolumeSpecName: "config") pod "1748e8eb-85b8-4481-a1b4-3ce55f029c8d" (UID: "1748e8eb-85b8-4481-a1b4-3ce55f029c8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.671810 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-kube-api-access-zqkgk" (OuterVolumeSpecName: "kube-api-access-zqkgk") pod "1748e8eb-85b8-4481-a1b4-3ce55f029c8d" (UID: "1748e8eb-85b8-4481-a1b4-3ce55f029c8d"). InnerVolumeSpecName "kube-api-access-zqkgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.676801 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1748e8eb-85b8-4481-a1b4-3ce55f029c8d" (UID: "1748e8eb-85b8-4481-a1b4-3ce55f029c8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.767099 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.767144 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqkgk\" (UniqueName: \"kubernetes.io/projected/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-kube-api-access-zqkgk\") on node \"crc\" DevicePath \"\""
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.767161 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-config\") on node \"crc\" DevicePath \"\""
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.767172 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1748e8eb-85b8-4481-a1b4-3ce55f029c8d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.793438 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sk66r"]
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.902949 4965 generic.go:334] "Generic (PLEG): container finished" podID="1748e8eb-85b8-4481-a1b4-3ce55f029c8d" containerID="4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c" exitCode=0
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.903795 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.908771 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" event={"ID":"1748e8eb-85b8-4481-a1b4-3ce55f029c8d","Type":"ContainerDied","Data":"4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c"}
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.908847 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx" event={"ID":"1748e8eb-85b8-4481-a1b4-3ce55f029c8d","Type":"ContainerDied","Data":"a166456cb38bdd1f5cf16b84139ea655e97b9fd8bc500685d5d61bbc4ce69506"}
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.908872 4965 scope.go:117] "RemoveContainer" containerID="4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c"
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.934099 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"]
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.934243 4965 scope.go:117] "RemoveContainer" containerID="4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c"
Mar 18 11:59:08 crc kubenswrapper[4965]: E0318 11:59:08.934642 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c\": container with ID starting with 4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c not found: ID does not exist" containerID="4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c"
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.934694 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c"} err="failed to get container status \"4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c\": rpc error: code = NotFound desc = could not find container \"4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c\": container with ID starting with 4051179557d3822f54454683e97483230a89c4f57faeb002cc8c3ba729af666c not found: ID does not exist"
Mar 18 11:59:08 crc kubenswrapper[4965]: I0318 11:59:08.937305 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6995df46-nmlhx"]
Mar 18 11:59:09 crc kubenswrapper[4965]: I0318 11:59:09.909172 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5" podUID="19d94bd2-7927-466e-a14a-b2f25275b653" containerName="controller-manager" containerID="cri-o://d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649" gracePeriod=30
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.026767 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1748e8eb-85b8-4481-a1b4-3ce55f029c8d" path="/var/lib/kubelet/pods/1748e8eb-85b8-4481-a1b4-3ce55f029c8d/volumes"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.047264 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 18 11:59:10 crc kubenswrapper[4965]: E0318 11:59:10.047520 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1748e8eb-85b8-4481-a1b4-3ce55f029c8d" containerName="route-controller-manager"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.047534 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="1748e8eb-85b8-4481-a1b4-3ce55f029c8d" containerName="route-controller-manager"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.047691 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="1748e8eb-85b8-4481-a1b4-3ce55f029c8d" containerName="route-controller-manager"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.048030 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.052555 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.052850 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.060580 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.087607 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.087720 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.188973 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.189044 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.189129 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.210909 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.274204 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"]
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.274814 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.277905 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.278673 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.279124 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.279324 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.279509 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.280088 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.281170 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"]
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.290566 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276ms\" (UniqueName: \"kubernetes.io/projected/e37096ec-855c-47b4-8815-6d54a2b95c50-kube-api-access-276ms\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.290597 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-client-ca\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.290615 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e37096ec-855c-47b4-8815-6d54a2b95c50-serving-cert\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.290707 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-config\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.373909 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.389622 4965 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.392231 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-config\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.392308 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276ms\" (UniqueName: \"kubernetes.io/projected/e37096ec-855c-47b4-8815-6d54a2b95c50-kube-api-access-276ms\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.392346 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-client-ca\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.392366 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e37096ec-855c-47b4-8815-6d54a2b95c50-serving-cert\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.393223 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-client-ca\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.393277 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-config\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.400308 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e37096ec-855c-47b4-8815-6d54a2b95c50-serving-cert\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.413501 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276ms\" (UniqueName: \"kubernetes.io/projected/e37096ec-855c-47b4-8815-6d54a2b95c50-kube-api-access-276ms\") pod \"route-controller-manager-67944cbc48-46jc2\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.494072 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-config\") pod \"19d94bd2-7927-466e-a14a-b2f25275b653\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.494129 4965 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-proxy-ca-bundles\") pod \"19d94bd2-7927-466e-a14a-b2f25275b653\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.494205 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-client-ca\") pod \"19d94bd2-7927-466e-a14a-b2f25275b653\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.494293 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvhv7\" (UniqueName: \"kubernetes.io/projected/19d94bd2-7927-466e-a14a-b2f25275b653-kube-api-access-fvhv7\") pod \"19d94bd2-7927-466e-a14a-b2f25275b653\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.494336 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d94bd2-7927-466e-a14a-b2f25275b653-serving-cert\") pod \"19d94bd2-7927-466e-a14a-b2f25275b653\" (UID: \"19d94bd2-7927-466e-a14a-b2f25275b653\") " Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.495788 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "19d94bd2-7927-466e-a14a-b2f25275b653" (UID: "19d94bd2-7927-466e-a14a-b2f25275b653"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.495824 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-client-ca" (OuterVolumeSpecName: "client-ca") pod "19d94bd2-7927-466e-a14a-b2f25275b653" (UID: "19d94bd2-7927-466e-a14a-b2f25275b653"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.495955 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-config" (OuterVolumeSpecName: "config") pod "19d94bd2-7927-466e-a14a-b2f25275b653" (UID: "19d94bd2-7927-466e-a14a-b2f25275b653"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.497646 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d94bd2-7927-466e-a14a-b2f25275b653-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "19d94bd2-7927-466e-a14a-b2f25275b653" (UID: "19d94bd2-7927-466e-a14a-b2f25275b653"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.497951 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d94bd2-7927-466e-a14a-b2f25275b653-kube-api-access-fvhv7" (OuterVolumeSpecName: "kube-api-access-fvhv7") pod "19d94bd2-7927-466e-a14a-b2f25275b653" (UID: "19d94bd2-7927-466e-a14a-b2f25275b653"). InnerVolumeSpecName "kube-api-access-fvhv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.595865 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.595895 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvhv7\" (UniqueName: \"kubernetes.io/projected/19d94bd2-7927-466e-a14a-b2f25275b653-kube-api-access-fvhv7\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.595905 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d94bd2-7927-466e-a14a-b2f25275b653-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.595914 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.595923 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d94bd2-7927-466e-a14a-b2f25275b653-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.599616 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.789895 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.917893 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8","Type":"ContainerStarted","Data":"fad869a2532c0519aa2dc58f326c791caa5333d034d463b70867ad1bc3d1144d"} Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.919727 4965 generic.go:334] "Generic (PLEG): container finished" podID="19d94bd2-7927-466e-a14a-b2f25275b653" containerID="d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649" exitCode=0 Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.919775 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5" event={"ID":"19d94bd2-7927-466e-a14a-b2f25275b653","Type":"ContainerDied","Data":"d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649"} Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.919794 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.919806 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68689d9bbb-zzqp5" event={"ID":"19d94bd2-7927-466e-a14a-b2f25275b653","Type":"ContainerDied","Data":"ab01532fac80efe9040829e2a53b3e7defa225a746d27fcf71dfe8cf382745ef"} Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.919830 4965 scope.go:117] "RemoveContainer" containerID="d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.941518 4965 scope.go:117] "RemoveContainer" containerID="d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649" Mar 18 11:59:10 crc kubenswrapper[4965]: E0318 11:59:10.942013 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649\": container with ID starting with d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649 not found: ID does not exist" containerID="d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.942045 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649"} err="failed to get container status \"d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649\": rpc error: code = NotFound desc = could not find container \"d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649\": container with ID starting with d69dc398c60727beaac2d8e951903b33dee90ad241f548d3e89cfef79cfe5649 not found: ID does not exist" Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.956515 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"] Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.966496 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-68689d9bbb-zzqp5"] Mar 18 11:59:10 crc kubenswrapper[4965]: I0318 11:59:10.980526 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"] Mar 18 11:59:10 crc kubenswrapper[4965]: W0318 11:59:10.986870 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37096ec_855c_47b4_8815_6d54a2b95c50.slice/crio-9aa87e61365b541a4fef414f88d37b0291e353b2512ea1a23e238225626b4735 WatchSource:0}: Error finding container 9aa87e61365b541a4fef414f88d37b0291e353b2512ea1a23e238225626b4735: Status 404 returned error can't find the container with id 9aa87e61365b541a4fef414f88d37b0291e353b2512ea1a23e238225626b4735 Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.273267 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx"] Mar 18 11:59:11 crc kubenswrapper[4965]: E0318 11:59:11.273522 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d94bd2-7927-466e-a14a-b2f25275b653" containerName="controller-manager" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.273535 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d94bd2-7927-466e-a14a-b2f25275b653" containerName="controller-manager" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.273673 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d94bd2-7927-466e-a14a-b2f25275b653" containerName="controller-manager" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.274226 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.276977 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.277041 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.277287 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.278448 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.279636 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.280136 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.288690 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.290631 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx"] Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.303500 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-serving-cert\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " 
pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.303544 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-proxy-ca-bundles\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.303578 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-client-ca\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.303630 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-config\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.303651 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dlgv\" (UniqueName: \"kubernetes.io/projected/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-kube-api-access-6dlgv\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.404705 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-serving-cert\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.404830 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-proxy-ca-bundles\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.404883 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-client-ca\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.404959 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-config\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.404989 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dlgv\" (UniqueName: \"kubernetes.io/projected/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-kube-api-access-6dlgv\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.406131 
4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-proxy-ca-bundles\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.406142 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-client-ca\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.408317 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-config\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.410205 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-serving-cert\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.419528 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dlgv\" (UniqueName: \"kubernetes.io/projected/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-kube-api-access-6dlgv\") pod \"controller-manager-6c59d97bd4-8kwdx\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 
11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.591866 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.928489 4965 generic.go:334] "Generic (PLEG): container finished" podID="7ada1365-a4aa-4a56-8fc6-e71aa7933bc8" containerID="79a9d80f860472ebc48c11ba5e0e856181e733dd0422d3c3c76de8a98c86a519" exitCode=0 Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.928546 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8","Type":"ContainerDied","Data":"79a9d80f860472ebc48c11ba5e0e856181e733dd0422d3c3c76de8a98c86a519"} Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.930548 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" event={"ID":"e37096ec-855c-47b4-8815-6d54a2b95c50","Type":"ContainerStarted","Data":"38cc7a8e788b31c6831a1d32642b0b1896569df44a123a2cbe6efd94bb385f1d"} Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.930599 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" event={"ID":"e37096ec-855c-47b4-8815-6d54a2b95c50","Type":"ContainerStarted","Data":"9aa87e61365b541a4fef414f88d37b0291e353b2512ea1a23e238225626b4735"} Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.930788 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.936226 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:11 crc kubenswrapper[4965]: I0318 11:59:11.960097 4965 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" podStartSLOduration=3.96008166 podStartE2EDuration="3.96008166s" podCreationTimestamp="2026-03-18 11:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:11.956935229 +0000 UTC m=+156.943122708" watchObservedRunningTime="2026-03-18 11:59:11.96008166 +0000 UTC m=+156.946269139" Mar 18 11:59:12 crc kubenswrapper[4965]: W0318 11:59:12.033629 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa2dea9b_4dc5_4223_9c11_13130a44c7d4.slice/crio-7fa63888579720b78109d270f7d4644f965f94e3049f73b6dc6f60afa354b227 WatchSource:0}: Error finding container 7fa63888579720b78109d270f7d4644f965f94e3049f73b6dc6f60afa354b227: Status 404 returned error can't find the container with id 7fa63888579720b78109d270f7d4644f965f94e3049f73b6dc6f60afa354b227 Mar 18 11:59:12 crc kubenswrapper[4965]: I0318 11:59:12.036068 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d94bd2-7927-466e-a14a-b2f25275b653" path="/var/lib/kubelet/pods/19d94bd2-7927-466e-a14a-b2f25275b653/volumes" Mar 18 11:59:12 crc kubenswrapper[4965]: I0318 11:59:12.037149 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx"] Mar 18 11:59:12 crc kubenswrapper[4965]: I0318 11:59:12.937076 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" event={"ID":"aa2dea9b-4dc5-4223-9c11-13130a44c7d4","Type":"ContainerStarted","Data":"2cbe35b7326e966f07dc3dec53832d3f906e8b92f9e07aab3e3be530bc030f88"} Mar 18 11:59:12 crc kubenswrapper[4965]: I0318 11:59:12.937625 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:12 crc kubenswrapper[4965]: I0318 11:59:12.937682 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" event={"ID":"aa2dea9b-4dc5-4223-9c11-13130a44c7d4","Type":"ContainerStarted","Data":"7fa63888579720b78109d270f7d4644f965f94e3049f73b6dc6f60afa354b227"} Mar 18 11:59:12 crc kubenswrapper[4965]: I0318 11:59:12.942595 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:12 crc kubenswrapper[4965]: I0318 11:59:12.960856 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" podStartSLOduration=4.960835681 podStartE2EDuration="4.960835681s" podCreationTimestamp="2026-03-18 11:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:12.958823213 +0000 UTC m=+157.945010692" watchObservedRunningTime="2026-03-18 11:59:12.960835681 +0000 UTC m=+157.947023160" Mar 18 11:59:12 crc kubenswrapper[4965]: I0318 11:59:12.982698 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:59:12 crc kubenswrapper[4965]: I0318 11:59:12.982741 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.142218 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.191264 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.230543 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kube-api-access\") pod \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\" (UID: \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\") " Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.230674 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kubelet-dir\") pod \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\" (UID: \"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8\") " Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.230746 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7ada1365-a4aa-4a56-8fc6-e71aa7933bc8" (UID: "7ada1365-a4aa-4a56-8fc6-e71aa7933bc8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.231050 4965 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.241792 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7ada1365-a4aa-4a56-8fc6-e71aa7933bc8" (UID: "7ada1365-a4aa-4a56-8fc6-e71aa7933bc8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.332076 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ada1365-a4aa-4a56-8fc6-e71aa7933bc8-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.945832 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.945861 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7ada1365-a4aa-4a56-8fc6-e71aa7933bc8","Type":"ContainerDied","Data":"fad869a2532c0519aa2dc58f326c791caa5333d034d463b70867ad1bc3d1144d"} Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.945938 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fad869a2532c0519aa2dc58f326c791caa5333d034d463b70867ad1bc3d1144d" Mar 18 11:59:13 crc kubenswrapper[4965]: I0318 11:59:13.987675 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sjkft" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.246481 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 11:59:15 crc kubenswrapper[4965]: E0318 11:59:15.247063 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ada1365-a4aa-4a56-8fc6-e71aa7933bc8" containerName="pruner" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.247079 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ada1365-a4aa-4a56-8fc6-e71aa7933bc8" containerName="pruner" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.247216 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ada1365-a4aa-4a56-8fc6-e71aa7933bc8" containerName="pruner" Mar 18 
11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.247675 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.250044 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.250310 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.256953 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.358420 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-kubelet-dir\") pod \"installer-9-crc\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.358505 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-var-lock\") pod \"installer-9-crc\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.358610 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f56f075-8aa8-4030-849a-ae1fc80135ef-kube-api-access\") pod \"installer-9-crc\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.459417 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-kubelet-dir\") pod \"installer-9-crc\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.459521 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-kubelet-dir\") pod \"installer-9-crc\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.459715 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-var-lock\") pod \"installer-9-crc\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.459798 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-var-lock\") pod \"installer-9-crc\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.459945 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f56f075-8aa8-4030-849a-ae1fc80135ef-kube-api-access\") pod \"installer-9-crc\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.480768 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f56f075-8aa8-4030-849a-ae1fc80135ef-kube-api-access\") 
pod \"installer-9-crc\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.569754 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.964387 4965 generic.go:334] "Generic (PLEG): container finished" podID="11860b27-ab66-423c-9939-df595c023a38" containerID="faea6ef8290514a116d02628b9e683c70465eed2a2c25761f989c85155e7207e" exitCode=0 Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.964427 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9btzt" event={"ID":"11860b27-ab66-423c-9939-df595c023a38","Type":"ContainerDied","Data":"faea6ef8290514a116d02628b9e683c70465eed2a2c25761f989c85155e7207e"} Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.968582 4965 generic.go:334] "Generic (PLEG): container finished" podID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerID="9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d" exitCode=0 Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.968619 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7g7" event={"ID":"8edec177-0701-4d1c-bb61-33c0a05df51d","Type":"ContainerDied","Data":"9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d"} Mar 18 11:59:15 crc kubenswrapper[4965]: I0318 11:59:15.985734 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 11:59:16 crc kubenswrapper[4965]: I0318 11:59:16.976198 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9btzt" event={"ID":"11860b27-ab66-423c-9939-df595c023a38","Type":"ContainerStarted","Data":"c37f8ae4be61b88504dbd877e673828ba7307420fa6186794fd60859edf603ba"} Mar 18 11:59:16 crc kubenswrapper[4965]: I0318 
11:59:16.978085 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7g7" event={"ID":"8edec177-0701-4d1c-bb61-33c0a05df51d","Type":"ContainerStarted","Data":"f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d"} Mar 18 11:59:16 crc kubenswrapper[4965]: I0318 11:59:16.979926 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2f56f075-8aa8-4030-849a-ae1fc80135ef","Type":"ContainerStarted","Data":"3851d115e5176fbc2d45309d5534661867861818d77bc7e54f548c170a496872"} Mar 18 11:59:16 crc kubenswrapper[4965]: I0318 11:59:16.980036 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2f56f075-8aa8-4030-849a-ae1fc80135ef","Type":"ContainerStarted","Data":"265d94ea60971145ea8cf5dfefabe26919a9b69d16dcb2c90b292a6768ed0593"} Mar 18 11:59:16 crc kubenswrapper[4965]: I0318 11:59:16.998152 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9btzt" podStartSLOduration=3.017583197 podStartE2EDuration="44.99813291s" podCreationTimestamp="2026-03-18 11:58:32 +0000 UTC" firstStartedPulling="2026-03-18 11:58:34.424477053 +0000 UTC m=+119.410664542" lastFinishedPulling="2026-03-18 11:59:16.405026776 +0000 UTC m=+161.391214255" observedRunningTime="2026-03-18 11:59:16.996638666 +0000 UTC m=+161.982826145" watchObservedRunningTime="2026-03-18 11:59:16.99813291 +0000 UTC m=+161.984320389" Mar 18 11:59:17 crc kubenswrapper[4965]: I0318 11:59:17.022053 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zx7g7" podStartSLOduration=3.135892785 podStartE2EDuration="42.022035776s" podCreationTimestamp="2026-03-18 11:58:35 +0000 UTC" firstStartedPulling="2026-03-18 11:58:37.589311845 +0000 UTC m=+122.575499324" lastFinishedPulling="2026-03-18 11:59:16.475454836 +0000 UTC 
m=+161.461642315" observedRunningTime="2026-03-18 11:59:17.019646606 +0000 UTC m=+162.005834095" watchObservedRunningTime="2026-03-18 11:59:17.022035776 +0000 UTC m=+162.008223265" Mar 18 11:59:17 crc kubenswrapper[4965]: I0318 11:59:17.037959 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.037945629 podStartE2EDuration="2.037945629s" podCreationTimestamp="2026-03-18 11:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:17.03559605 +0000 UTC m=+162.021783539" watchObservedRunningTime="2026-03-18 11:59:17.037945629 +0000 UTC m=+162.024133108" Mar 18 11:59:18 crc kubenswrapper[4965]: I0318 11:59:18.979682 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 11:59:19 crc kubenswrapper[4965]: I0318 11:59:19.013019 4965 generic.go:334] "Generic (PLEG): container finished" podID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerID="bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703" exitCode=0 Mar 18 11:59:19 crc kubenswrapper[4965]: I0318 11:59:19.013170 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkz24" event={"ID":"7a889591-6c1d-4940-9eb2-28cfa6988f89","Type":"ContainerDied","Data":"bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703"} Mar 18 11:59:19 crc kubenswrapper[4965]: I0318 11:59:19.090795 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 11:59:20 crc kubenswrapper[4965]: I0318 11:59:20.024835 4965 generic.go:334] "Generic (PLEG): container finished" podID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerID="954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876" exitCode=0 Mar 18 11:59:20 crc kubenswrapper[4965]: I0318 11:59:20.026601 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkz24" event={"ID":"7a889591-6c1d-4940-9eb2-28cfa6988f89","Type":"ContainerStarted","Data":"fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd"} Mar 18 11:59:20 crc kubenswrapper[4965]: I0318 11:59:20.026646 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwpmg" event={"ID":"f76b91e1-f768-4cb1-857d-6e3eb31e59f6","Type":"ContainerDied","Data":"954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876"} Mar 18 11:59:20 crc kubenswrapper[4965]: I0318 11:59:20.036188 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq6m2" event={"ID":"6c7d53d7-b4fc-48d5-b8f4-545ea972697e","Type":"ContainerStarted","Data":"9261b312524f789032b5d10f99b55317718d7d02ad982aa7199d7f77d2e70d43"} Mar 18 11:59:20 crc kubenswrapper[4965]: I0318 11:59:20.038641 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wjlx" event={"ID":"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5","Type":"ContainerStarted","Data":"5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d"} Mar 18 11:59:20 crc kubenswrapper[4965]: I0318 11:59:20.047431 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rkz24" podStartSLOduration=3.197864238 podStartE2EDuration="45.04741513s" podCreationTimestamp="2026-03-18 11:58:35 +0000 UTC" firstStartedPulling="2026-03-18 11:58:37.58297903 +0000 UTC m=+122.569166509" lastFinishedPulling="2026-03-18 11:59:19.432529922 +0000 UTC m=+164.418717401" observedRunningTime="2026-03-18 11:59:20.0460137 +0000 UTC m=+165.032201179" watchObservedRunningTime="2026-03-18 11:59:20.04741513 +0000 UTC m=+165.033602609" Mar 18 11:59:20 crc kubenswrapper[4965]: I0318 11:59:20.114511 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd/etcd-crc" podStartSLOduration=1.114491673 podStartE2EDuration="1.114491673s" podCreationTimestamp="2026-03-18 11:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:20.110174657 +0000 UTC m=+165.096362136" watchObservedRunningTime="2026-03-18 11:59:20.114491673 +0000 UTC m=+165.100679152" Mar 18 11:59:21 crc kubenswrapper[4965]: I0318 11:59:21.046428 4965 generic.go:334] "Generic (PLEG): container finished" podID="4329347a-487b-4012-a715-9565bc4e67d0" containerID="d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df" exitCode=0 Mar 18 11:59:21 crc kubenswrapper[4965]: I0318 11:59:21.046507 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pk8b" event={"ID":"4329347a-487b-4012-a715-9565bc4e67d0","Type":"ContainerDied","Data":"d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df"} Mar 18 11:59:21 crc kubenswrapper[4965]: I0318 11:59:21.049209 4965 generic.go:334] "Generic (PLEG): container finished" podID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerID="9261b312524f789032b5d10f99b55317718d7d02ad982aa7199d7f77d2e70d43" exitCode=0 Mar 18 11:59:21 crc kubenswrapper[4965]: I0318 11:59:21.049301 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq6m2" event={"ID":"6c7d53d7-b4fc-48d5-b8f4-545ea972697e","Type":"ContainerDied","Data":"9261b312524f789032b5d10f99b55317718d7d02ad982aa7199d7f77d2e70d43"} Mar 18 11:59:21 crc kubenswrapper[4965]: I0318 11:59:21.056738 4965 generic.go:334] "Generic (PLEG): container finished" podID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerID="5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d" exitCode=0 Mar 18 11:59:21 crc kubenswrapper[4965]: I0318 11:59:21.056789 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wjlx" 
event={"ID":"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5","Type":"ContainerDied","Data":"5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d"} Mar 18 11:59:23 crc kubenswrapper[4965]: I0318 11:59:23.070334 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwpmg" event={"ID":"f76b91e1-f768-4cb1-857d-6e3eb31e59f6","Type":"ContainerStarted","Data":"c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e"} Mar 18 11:59:23 crc kubenswrapper[4965]: I0318 11:59:23.092093 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gwpmg" podStartSLOduration=3.3793275019999998 podStartE2EDuration="49.092069936s" podCreationTimestamp="2026-03-18 11:58:34 +0000 UTC" firstStartedPulling="2026-03-18 11:58:36.567585663 +0000 UTC m=+121.553773142" lastFinishedPulling="2026-03-18 11:59:22.280328097 +0000 UTC m=+167.266515576" observedRunningTime="2026-03-18 11:59:23.088055359 +0000 UTC m=+168.074242858" watchObservedRunningTime="2026-03-18 11:59:23.092069936 +0000 UTC m=+168.078257415" Mar 18 11:59:23 crc kubenswrapper[4965]: I0318 11:59:23.149384 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:59:23 crc kubenswrapper[4965]: I0318 11:59:23.149441 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:59:23 crc kubenswrapper[4965]: I0318 11:59:23.185757 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:59:24 crc kubenswrapper[4965]: I0318 11:59:24.076168 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pk8b" 
event={"ID":"4329347a-487b-4012-a715-9565bc4e67d0","Type":"ContainerStarted","Data":"14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89"} Mar 18 11:59:24 crc kubenswrapper[4965]: I0318 11:59:24.077865 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq6m2" event={"ID":"6c7d53d7-b4fc-48d5-b8f4-545ea972697e","Type":"ContainerStarted","Data":"271f75a53c1cf419b7e3cfc294577dad2d7a027a42854facf90df16ff6a273c8"} Mar 18 11:59:24 crc kubenswrapper[4965]: I0318 11:59:24.080116 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wjlx" event={"ID":"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5","Type":"ContainerStarted","Data":"9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f"} Mar 18 11:59:24 crc kubenswrapper[4965]: I0318 11:59:24.098524 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6pk8b" podStartSLOduration=1.776947919 podStartE2EDuration="51.098509852s" podCreationTimestamp="2026-03-18 11:58:33 +0000 UTC" firstStartedPulling="2026-03-18 11:58:34.253604845 +0000 UTC m=+119.239792324" lastFinishedPulling="2026-03-18 11:59:23.575166778 +0000 UTC m=+168.561354257" observedRunningTime="2026-03-18 11:59:24.096574285 +0000 UTC m=+169.082761764" watchObservedRunningTime="2026-03-18 11:59:24.098509852 +0000 UTC m=+169.084697331" Mar 18 11:59:24 crc kubenswrapper[4965]: I0318 11:59:24.115226 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5wjlx" podStartSLOduration=3.178824967 podStartE2EDuration="52.115192436s" podCreationTimestamp="2026-03-18 11:58:32 +0000 UTC" firstStartedPulling="2026-03-18 11:58:34.239769484 +0000 UTC m=+119.225956963" lastFinishedPulling="2026-03-18 11:59:23.176136953 +0000 UTC m=+168.162324432" observedRunningTime="2026-03-18 11:59:24.113798606 +0000 UTC m=+169.099986095" 
watchObservedRunningTime="2026-03-18 11:59:24.115192436 +0000 UTC m=+169.101379915" Mar 18 11:59:24 crc kubenswrapper[4965]: I0318 11:59:24.126969 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:59:24 crc kubenswrapper[4965]: I0318 11:59:24.132565 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bq6m2" podStartSLOduration=2.170936031 podStartE2EDuration="48.132555302s" podCreationTimestamp="2026-03-18 11:58:36 +0000 UTC" firstStartedPulling="2026-03-18 11:58:37.597541851 +0000 UTC m=+122.583729330" lastFinishedPulling="2026-03-18 11:59:23.559161122 +0000 UTC m=+168.545348601" observedRunningTime="2026-03-18 11:59:24.130400059 +0000 UTC m=+169.116587538" watchObservedRunningTime="2026-03-18 11:59:24.132555302 +0000 UTC m=+169.118742781" Mar 18 11:59:25 crc kubenswrapper[4965]: I0318 11:59:25.009561 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:59:25 crc kubenswrapper[4965]: I0318 11:59:25.011346 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:59:25 crc kubenswrapper[4965]: I0318 11:59:25.066879 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:59:25 crc kubenswrapper[4965]: I0318 11:59:25.396496 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:59:25 crc kubenswrapper[4965]: I0318 11:59:25.396540 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:59:25 crc kubenswrapper[4965]: I0318 11:59:25.443451 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:59:25 crc kubenswrapper[4965]: I0318 11:59:25.575321 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9btzt"] Mar 18 11:59:26 crc kubenswrapper[4965]: I0318 11:59:26.058842 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:59:26 crc kubenswrapper[4965]: I0318 11:59:26.059159 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:59:26 crc kubenswrapper[4965]: I0318 11:59:26.091649 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9btzt" podUID="11860b27-ab66-423c-9939-df595c023a38" containerName="registry-server" containerID="cri-o://c37f8ae4be61b88504dbd877e673828ba7307420fa6186794fd60859edf603ba" gracePeriod=2 Mar 18 11:59:26 crc kubenswrapper[4965]: I0318 11:59:26.108009 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:59:26 crc kubenswrapper[4965]: I0318 11:59:26.173356 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:59:26 crc kubenswrapper[4965]: I0318 11:59:26.436933 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:59:26 crc kubenswrapper[4965]: I0318 11:59:26.442731 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.099647 4965 generic.go:334] "Generic (PLEG): container finished" podID="11860b27-ab66-423c-9939-df595c023a38" containerID="c37f8ae4be61b88504dbd877e673828ba7307420fa6186794fd60859edf603ba" exitCode=0 Mar 18 11:59:27 crc 
kubenswrapper[4965]: I0318 11:59:27.099726 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9btzt" event={"ID":"11860b27-ab66-423c-9939-df595c023a38","Type":"ContainerDied","Data":"c37f8ae4be61b88504dbd877e673828ba7307420fa6186794fd60859edf603ba"} Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.165959 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.487577 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bq6m2" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerName="registry-server" probeResult="failure" output=< Mar 18 11:59:27 crc kubenswrapper[4965]: timeout: failed to connect service ":50051" within 1s Mar 18 11:59:27 crc kubenswrapper[4965]: > Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.531730 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.717562 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-utilities\") pod \"11860b27-ab66-423c-9939-df595c023a38\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.717610 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6wdl\" (UniqueName: \"kubernetes.io/projected/11860b27-ab66-423c-9939-df595c023a38-kube-api-access-h6wdl\") pod \"11860b27-ab66-423c-9939-df595c023a38\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.717937 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-catalog-content\") pod \"11860b27-ab66-423c-9939-df595c023a38\" (UID: \"11860b27-ab66-423c-9939-df595c023a38\") " Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.719424 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-utilities" (OuterVolumeSpecName: "utilities") pod "11860b27-ab66-423c-9939-df595c023a38" (UID: "11860b27-ab66-423c-9939-df595c023a38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.722715 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11860b27-ab66-423c-9939-df595c023a38-kube-api-access-h6wdl" (OuterVolumeSpecName: "kube-api-access-h6wdl") pod "11860b27-ab66-423c-9939-df595c023a38" (UID: "11860b27-ab66-423c-9939-df595c023a38"). InnerVolumeSpecName "kube-api-access-h6wdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.819122 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.819162 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6wdl\" (UniqueName: \"kubernetes.io/projected/11860b27-ab66-423c-9939-df595c023a38-kube-api-access-h6wdl\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.899083 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11860b27-ab66-423c-9939-df595c023a38" (UID: "11860b27-ab66-423c-9939-df595c023a38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:59:27 crc kubenswrapper[4965]: I0318 11:59:27.920222 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11860b27-ab66-423c-9939-df595c023a38-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.107391 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9btzt" Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.107799 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9btzt" event={"ID":"11860b27-ab66-423c-9939-df595c023a38","Type":"ContainerDied","Data":"1f708c444020d7f9fbc4a4abea9811bd3a12dfd982ad027969b7f6f7782e477c"} Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.107831 4965 scope.go:117] "RemoveContainer" containerID="c37f8ae4be61b88504dbd877e673828ba7307420fa6186794fd60859edf603ba" Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.125568 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9btzt"] Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.125775 4965 scope.go:117] "RemoveContainer" containerID="faea6ef8290514a116d02628b9e683c70465eed2a2c25761f989c85155e7207e" Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.129619 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9btzt"] Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.173305 4965 scope.go:117] "RemoveContainer" containerID="684ca9c8a152ea91a14aa7f18c1bc9c7c7190a48a5a6507329432f8b03064b25" Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.247864 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx"] Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.248094 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" podUID="aa2dea9b-4dc5-4223-9c11-13130a44c7d4" containerName="controller-manager" containerID="cri-o://2cbe35b7326e966f07dc3dec53832d3f906e8b92f9e07aab3e3be530bc030f88" gracePeriod=30 Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.252851 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"] Mar 18 11:59:28 crc kubenswrapper[4965]: I0318 11:59:28.253109 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" podUID="e37096ec-855c-47b4-8815-6d54a2b95c50" containerName="route-controller-manager" containerID="cri-o://38cc7a8e788b31c6831a1d32642b0b1896569df44a123a2cbe6efd94bb385f1d" gracePeriod=30 Mar 18 11:59:29 crc kubenswrapper[4965]: I0318 11:59:29.113807 4965 generic.go:334] "Generic (PLEG): container finished" podID="aa2dea9b-4dc5-4223-9c11-13130a44c7d4" containerID="2cbe35b7326e966f07dc3dec53832d3f906e8b92f9e07aab3e3be530bc030f88" exitCode=0 Mar 18 11:59:29 crc kubenswrapper[4965]: I0318 11:59:29.113918 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" event={"ID":"aa2dea9b-4dc5-4223-9c11-13130a44c7d4","Type":"ContainerDied","Data":"2cbe35b7326e966f07dc3dec53832d3f906e8b92f9e07aab3e3be530bc030f88"} Mar 18 11:59:29 crc kubenswrapper[4965]: I0318 11:59:29.373843 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkz24"] Mar 18 11:59:29 crc kubenswrapper[4965]: I0318 11:59:29.374126 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rkz24" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerName="registry-server" containerID="cri-o://fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd" gracePeriod=2 Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.046797 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11860b27-ab66-423c-9939-df595c023a38" path="/var/lib/kubelet/pods/11860b27-ab66-423c-9939-df595c023a38/volumes" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.129025 4965 generic.go:334] "Generic (PLEG): container 
finished" podID="e37096ec-855c-47b4-8815-6d54a2b95c50" containerID="38cc7a8e788b31c6831a1d32642b0b1896569df44a123a2cbe6efd94bb385f1d" exitCode=0 Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.129074 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" event={"ID":"e37096ec-855c-47b4-8815-6d54a2b95c50","Type":"ContainerDied","Data":"38cc7a8e788b31c6831a1d32642b0b1896569df44a123a2cbe6efd94bb385f1d"} Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.421506 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.457432 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr"] Mar 18 11:59:30 crc kubenswrapper[4965]: E0318 11:59:30.457788 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11860b27-ab66-423c-9939-df595c023a38" containerName="registry-server" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.457811 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="11860b27-ab66-423c-9939-df595c023a38" containerName="registry-server" Mar 18 11:59:30 crc kubenswrapper[4965]: E0318 11:59:30.457822 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37096ec-855c-47b4-8815-6d54a2b95c50" containerName="route-controller-manager" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.457830 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37096ec-855c-47b4-8815-6d54a2b95c50" containerName="route-controller-manager" Mar 18 11:59:30 crc kubenswrapper[4965]: E0318 11:59:30.457841 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11860b27-ab66-423c-9939-df595c023a38" containerName="extract-content" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 
11:59:30.457852 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="11860b27-ab66-423c-9939-df595c023a38" containerName="extract-content" Mar 18 11:59:30 crc kubenswrapper[4965]: E0318 11:59:30.457868 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11860b27-ab66-423c-9939-df595c023a38" containerName="extract-utilities" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.457875 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="11860b27-ab66-423c-9939-df595c023a38" containerName="extract-utilities" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.458012 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37096ec-855c-47b4-8815-6d54a2b95c50" containerName="route-controller-manager" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.458025 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="11860b27-ab66-423c-9939-df595c023a38" containerName="registry-server" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.458534 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.467428 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr"] Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.562735 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.568171 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-276ms\" (UniqueName: \"kubernetes.io/projected/e37096ec-855c-47b4-8815-6d54a2b95c50-kube-api-access-276ms\") pod \"e37096ec-855c-47b4-8815-6d54a2b95c50\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.568255 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e37096ec-855c-47b4-8815-6d54a2b95c50-serving-cert\") pod \"e37096ec-855c-47b4-8815-6d54a2b95c50\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.568310 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-client-ca\") pod \"e37096ec-855c-47b4-8815-6d54a2b95c50\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.568400 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-config\") pod \"e37096ec-855c-47b4-8815-6d54a2b95c50\" (UID: \"e37096ec-855c-47b4-8815-6d54a2b95c50\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.568626 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd7km\" (UniqueName: \"kubernetes.io/projected/a95184d8-d4d1-429e-919c-629b26c7a5f2-kube-api-access-kd7km\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc 
kubenswrapper[4965]: I0318 11:59:30.568728 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a95184d8-d4d1-429e-919c-629b26c7a5f2-serving-cert\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.568772 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-config\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.568794 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-client-ca\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.569233 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-client-ca" (OuterVolumeSpecName: "client-ca") pod "e37096ec-855c-47b4-8815-6d54a2b95c50" (UID: "e37096ec-855c-47b4-8815-6d54a2b95c50"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.569255 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-config" (OuterVolumeSpecName: "config") pod "e37096ec-855c-47b4-8815-6d54a2b95c50" (UID: "e37096ec-855c-47b4-8815-6d54a2b95c50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.573301 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37096ec-855c-47b4-8815-6d54a2b95c50-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e37096ec-855c-47b4-8815-6d54a2b95c50" (UID: "e37096ec-855c-47b4-8815-6d54a2b95c50"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.573559 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37096ec-855c-47b4-8815-6d54a2b95c50-kube-api-access-276ms" (OuterVolumeSpecName: "kube-api-access-276ms") pod "e37096ec-855c-47b4-8815-6d54a2b95c50" (UID: "e37096ec-855c-47b4-8815-6d54a2b95c50"). InnerVolumeSpecName "kube-api-access-276ms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.670047 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-proxy-ca-bundles\") pod \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.670108 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-serving-cert\") pod \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.670126 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-config\") pod \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.670769 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dlgv\" (UniqueName: \"kubernetes.io/projected/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-kube-api-access-6dlgv\") pod \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.670800 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "aa2dea9b-4dc5-4223-9c11-13130a44c7d4" (UID: "aa2dea9b-4dc5-4223-9c11-13130a44c7d4"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.670805 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-client-ca\") pod \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\" (UID: \"aa2dea9b-4dc5-4223-9c11-13130a44c7d4\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.670989 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd7km\" (UniqueName: \"kubernetes.io/projected/a95184d8-d4d1-429e-919c-629b26c7a5f2-kube-api-access-kd7km\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671011 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a95184d8-d4d1-429e-919c-629b26c7a5f2-serving-cert\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671044 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-config\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671066 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-client-ca\") pod 
\"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671099 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-276ms\" (UniqueName: \"kubernetes.io/projected/e37096ec-855c-47b4-8815-6d54a2b95c50-kube-api-access-276ms\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671109 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e37096ec-855c-47b4-8815-6d54a2b95c50-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671120 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671128 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671137 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e37096ec-855c-47b4-8815-6d54a2b95c50-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671226 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-client-ca" (OuterVolumeSpecName: "client-ca") pod "aa2dea9b-4dc5-4223-9c11-13130a44c7d4" (UID: "aa2dea9b-4dc5-4223-9c11-13130a44c7d4"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671631 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-config" (OuterVolumeSpecName: "config") pod "aa2dea9b-4dc5-4223-9c11-13130a44c7d4" (UID: "aa2dea9b-4dc5-4223-9c11-13130a44c7d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.671972 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-client-ca\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.672424 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-config\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.673495 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-kube-api-access-6dlgv" (OuterVolumeSpecName: "kube-api-access-6dlgv") pod "aa2dea9b-4dc5-4223-9c11-13130a44c7d4" (UID: "aa2dea9b-4dc5-4223-9c11-13130a44c7d4"). InnerVolumeSpecName "kube-api-access-6dlgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.673962 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aa2dea9b-4dc5-4223-9c11-13130a44c7d4" (UID: "aa2dea9b-4dc5-4223-9c11-13130a44c7d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.675379 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a95184d8-d4d1-429e-919c-629b26c7a5f2-serving-cert\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.687241 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd7km\" (UniqueName: \"kubernetes.io/projected/a95184d8-d4d1-429e-919c-629b26c7a5f2-kube-api-access-kd7km\") pod \"route-controller-manager-65cf54489d-xz6nr\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.771857 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.771892 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.771904 4965 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6dlgv\" (UniqueName: \"kubernetes.io/projected/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-kube-api-access-6dlgv\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.771918 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa2dea9b-4dc5-4223-9c11-13130a44c7d4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.811870 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.814587 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.974934 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-utilities\") pod \"7a889591-6c1d-4940-9eb2-28cfa6988f89\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.974995 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-catalog-content\") pod \"7a889591-6c1d-4940-9eb2-28cfa6988f89\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.975035 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvjm4\" (UniqueName: \"kubernetes.io/projected/7a889591-6c1d-4940-9eb2-28cfa6988f89-kube-api-access-pvjm4\") pod \"7a889591-6c1d-4940-9eb2-28cfa6988f89\" (UID: \"7a889591-6c1d-4940-9eb2-28cfa6988f89\") " Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.976052 4965 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-utilities" (OuterVolumeSpecName: "utilities") pod "7a889591-6c1d-4940-9eb2-28cfa6988f89" (UID: "7a889591-6c1d-4940-9eb2-28cfa6988f89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.979279 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a889591-6c1d-4940-9eb2-28cfa6988f89-kube-api-access-pvjm4" (OuterVolumeSpecName: "kube-api-access-pvjm4") pod "7a889591-6c1d-4940-9eb2-28cfa6988f89" (UID: "7a889591-6c1d-4940-9eb2-28cfa6988f89"). InnerVolumeSpecName "kube-api-access-pvjm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.980254 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvjm4\" (UniqueName: \"kubernetes.io/projected/7a889591-6c1d-4940-9eb2-28cfa6988f89-kube-api-access-pvjm4\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.980287 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:30 crc kubenswrapper[4965]: I0318 11:59:30.984575 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr"] Mar 18 11:59:30 crc kubenswrapper[4965]: W0318 11:59:30.993403 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda95184d8_d4d1_429e_919c_629b26c7a5f2.slice/crio-5cf4e84e0cda6ee375c54b249b53add1467e5f82e6ff620f5f0f05dec68af529 WatchSource:0}: Error finding container 5cf4e84e0cda6ee375c54b249b53add1467e5f82e6ff620f5f0f05dec68af529: Status 404 returned 
error can't find the container with id 5cf4e84e0cda6ee375c54b249b53add1467e5f82e6ff620f5f0f05dec68af529 Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.002887 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a889591-6c1d-4940-9eb2-28cfa6988f89" (UID: "7a889591-6c1d-4940-9eb2-28cfa6988f89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.081242 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a889591-6c1d-4940-9eb2-28cfa6988f89-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.138043 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" event={"ID":"a95184d8-d4d1-429e-919c-629b26c7a5f2","Type":"ContainerStarted","Data":"2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755"} Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.138110 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" event={"ID":"a95184d8-d4d1-429e-919c-629b26c7a5f2","Type":"ContainerStarted","Data":"5cf4e84e0cda6ee375c54b249b53add1467e5f82e6ff620f5f0f05dec68af529"} Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.138137 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.141484 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" 
event={"ID":"aa2dea9b-4dc5-4223-9c11-13130a44c7d4","Type":"ContainerDied","Data":"7fa63888579720b78109d270f7d4644f965f94e3049f73b6dc6f60afa354b227"} Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.141545 4965 scope.go:117] "RemoveContainer" containerID="2cbe35b7326e966f07dc3dec53832d3f906e8b92f9e07aab3e3be530bc030f88" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.141576 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.141743 4965 patch_prober.go:28] interesting pod/route-controller-manager-65cf54489d-xz6nr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.141809 4965 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" podUID="a95184d8-d4d1-429e-919c-629b26c7a5f2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.154895 4965 generic.go:334] "Generic (PLEG): container finished" podID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerID="fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd" exitCode=0 Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.155017 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkz24" event={"ID":"7a889591-6c1d-4940-9eb2-28cfa6988f89","Type":"ContainerDied","Data":"fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd"} Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.155058 4965 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkz24" event={"ID":"7a889591-6c1d-4940-9eb2-28cfa6988f89","Type":"ContainerDied","Data":"832507c8698dd2e5cfe95eab0c602496a320ad1df64238be63be6e136360019b"} Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.155073 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkz24" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.162897 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" event={"ID":"e37096ec-855c-47b4-8815-6d54a2b95c50","Type":"ContainerDied","Data":"9aa87e61365b541a4fef414f88d37b0291e353b2512ea1a23e238225626b4735"} Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.162944 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.164248 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" podStartSLOduration=3.164220994 podStartE2EDuration="3.164220994s" podCreationTimestamp="2026-03-18 11:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:31.161385101 +0000 UTC m=+176.147572600" watchObservedRunningTime="2026-03-18 11:59:31.164220994 +0000 UTC m=+176.150408483" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.167780 4965 scope.go:117] "RemoveContainer" containerID="fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.189161 4965 scope.go:117] "RemoveContainer" containerID="bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703" Mar 18 
11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.189399 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx"] Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.192940 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c59d97bd4-8kwdx"] Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.210256 4965 scope.go:117] "RemoveContainer" containerID="f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.222609 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"] Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.227690 4965 scope.go:117] "RemoveContainer" containerID="fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd" Mar 18 11:59:31 crc kubenswrapper[4965]: E0318 11:59:31.227982 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd\": container with ID starting with fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd not found: ID does not exist" containerID="fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.228035 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd"} err="failed to get container status \"fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd\": rpc error: code = NotFound desc = could not find container \"fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd\": container with ID starting with fc45ce8c7c6bc50001b3f22b7386e5929de97d9356ab7aaff516a11765d521dd not found: ID does not exist" Mar 18 
11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.228057 4965 scope.go:117] "RemoveContainer" containerID="bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703" Mar 18 11:59:31 crc kubenswrapper[4965]: E0318 11:59:31.228257 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703\": container with ID starting with bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703 not found: ID does not exist" containerID="bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.228284 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703"} err="failed to get container status \"bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703\": rpc error: code = NotFound desc = could not find container \"bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703\": container with ID starting with bca7a5ae0df35e73e261fbcd671a32dc151ddb187c3745e8d013aa4ac3727703 not found: ID does not exist" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.228299 4965 scope.go:117] "RemoveContainer" containerID="f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b" Mar 18 11:59:31 crc kubenswrapper[4965]: E0318 11:59:31.228501 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b\": container with ID starting with f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b not found: ID does not exist" containerID="f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.228529 4965 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b"} err="failed to get container status \"f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b\": rpc error: code = NotFound desc = could not find container \"f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b\": container with ID starting with f38f7b6adeff507a29063839e67aa83cfb446d1623090d1f78bee27fb36a5b1b not found: ID does not exist" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.228553 4965 scope.go:117] "RemoveContainer" containerID="38cc7a8e788b31c6831a1d32642b0b1896569df44a123a2cbe6efd94bb385f1d" Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.233775 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67944cbc48-46jc2"] Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.238201 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkz24"] Mar 18 11:59:31 crc kubenswrapper[4965]: I0318 11:59:31.243083 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkz24"] Mar 18 11:59:32 crc kubenswrapper[4965]: I0318 11:59:32.029834 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" path="/var/lib/kubelet/pods/7a889591-6c1d-4940-9eb2-28cfa6988f89/volumes" Mar 18 11:59:32 crc kubenswrapper[4965]: I0318 11:59:32.031008 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2dea9b-4dc5-4223-9c11-13130a44c7d4" path="/var/lib/kubelet/pods/aa2dea9b-4dc5-4223-9c11-13130a44c7d4/volumes" Mar 18 11:59:32 crc kubenswrapper[4965]: I0318 11:59:32.031480 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37096ec-855c-47b4-8815-6d54a2b95c50" path="/var/lib/kubelet/pods/e37096ec-855c-47b4-8815-6d54a2b95c50/volumes" Mar 18 11:59:32 crc kubenswrapper[4965]: I0318 
11:59:32.087284 4965 ???:1] "http: TLS handshake error from 192.168.126.11:41476: no serving certificate available for the kubelet" Mar 18 11:59:32 crc kubenswrapper[4965]: I0318 11:59:32.179881 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:32 crc kubenswrapper[4965]: I0318 11:59:32.745982 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:59:32 crc kubenswrapper[4965]: I0318 11:59:32.746426 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:59:32 crc kubenswrapper[4965]: I0318 11:59:32.808775 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.247556 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5wjlx" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.326862 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd"] Mar 18 11:59:33 crc kubenswrapper[4965]: E0318 11:59:33.327154 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerName="extract-content" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.327169 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerName="extract-content" Mar 18 11:59:33 crc kubenswrapper[4965]: E0318 11:59:33.327185 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerName="registry-server" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.327195 4965 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerName="registry-server" Mar 18 11:59:33 crc kubenswrapper[4965]: E0318 11:59:33.327210 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerName="extract-utilities" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.327220 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerName="extract-utilities" Mar 18 11:59:33 crc kubenswrapper[4965]: E0318 11:59:33.327228 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2dea9b-4dc5-4223-9c11-13130a44c7d4" containerName="controller-manager" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.327235 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2dea9b-4dc5-4223-9c11-13130a44c7d4" containerName="controller-manager" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.327366 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a889591-6c1d-4940-9eb2-28cfa6988f89" containerName="registry-server" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.327381 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2dea9b-4dc5-4223-9c11-13130a44c7d4" containerName="controller-manager" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.327858 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.330826 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.330844 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.331745 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.331796 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.331947 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.332129 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.340906 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd"] Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.346461 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.449874 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.449959 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:59:33 crc kubenswrapper[4965]: 
I0318 11:59:33.494349 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.520562 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4h7l\" (UniqueName: \"kubernetes.io/projected/ef57bcc8-773d-4bcf-bb47-b335b44596af-kube-api-access-h4h7l\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.520639 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-proxy-ca-bundles\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.520677 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-config\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.520710 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-client-ca\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.520726 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef57bcc8-773d-4bcf-bb47-b335b44596af-serving-cert\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.621947 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-client-ca\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.622043 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef57bcc8-773d-4bcf-bb47-b335b44596af-serving-cert\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.622176 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4h7l\" (UniqueName: \"kubernetes.io/projected/ef57bcc8-773d-4bcf-bb47-b335b44596af-kube-api-access-h4h7l\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.622326 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-proxy-ca-bundles\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " 
pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.622388 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-config\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.623516 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-client-ca\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.624779 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-proxy-ca-bundles\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.624985 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-config\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.632945 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef57bcc8-773d-4bcf-bb47-b335b44596af-serving-cert\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: 
\"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.657216 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4h7l\" (UniqueName: \"kubernetes.io/projected/ef57bcc8-773d-4bcf-bb47-b335b44596af-kube-api-access-h4h7l\") pod \"controller-manager-7cbcbf47c-lb9gd\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.819964 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" podUID="8aa829e4-03d9-4359-9d6e-a7ce76a2072b" containerName="oauth-openshift" containerID="cri-o://4689c3919008d0d543f0fe1f20eefbade43853d2d91b298cf743a808bb1ae4a9" gracePeriod=15 Mar 18 11:59:33 crc kubenswrapper[4965]: I0318 11:59:33.952385 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.192630 4965 generic.go:334] "Generic (PLEG): container finished" podID="8aa829e4-03d9-4359-9d6e-a7ce76a2072b" containerID="4689c3919008d0d543f0fe1f20eefbade43853d2d91b298cf743a808bb1ae4a9" exitCode=0 Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.193498 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" event={"ID":"8aa829e4-03d9-4359-9d6e-a7ce76a2072b","Type":"ContainerDied","Data":"4689c3919008d0d543f0fe1f20eefbade43853d2d91b298cf743a808bb1ae4a9"} Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.239515 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.248438 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436190 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-provider-selection\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436266 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-trusted-ca-bundle\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436306 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-ocp-branding-template\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436333 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-policies\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436355 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4v994\" (UniqueName: \"kubernetes.io/projected/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-kube-api-access-4v994\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436377 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-service-ca\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436411 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-session\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436462 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-router-certs\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436483 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-idp-0-file-data\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436508 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-error\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436528 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-dir\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436564 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-serving-cert\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436596 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-cliconfig\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.436618 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-login\") pod \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\" (UID: \"8aa829e4-03d9-4359-9d6e-a7ce76a2072b\") " Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.438629 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: 
"8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.438991 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.439084 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.439646 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.440029 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.444243 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-kube-api-access-4v994" (OuterVolumeSpecName: "kube-api-access-4v994") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "kube-api-access-4v994". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.444268 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.444776 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.444975 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.445706 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.446345 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.446457 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.446776 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.446827 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8aa829e4-03d9-4359-9d6e-a7ce76a2072b" (UID: "8aa829e4-03d9-4359-9d6e-a7ce76a2072b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.447310 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd"] Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.537769 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538061 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538072 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538084 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538095 
4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538103 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538113 4965 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538122 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v994\" (UniqueName: \"kubernetes.io/projected/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-kube-api-access-4v994\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538132 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538140 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538149 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath 
\"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538157 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538166 4965 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:34 crc kubenswrapper[4965]: I0318 11:59:34.538174 4965 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8aa829e4-03d9-4359-9d6e-a7ce76a2072b-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.074518 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.199487 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" event={"ID":"ef57bcc8-773d-4bcf-bb47-b335b44596af","Type":"ContainerStarted","Data":"e28a859dc4e00ba84963568fa5ea14323a1108583c098fac093e758bc4c19ccd"} Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.199562 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" event={"ID":"ef57bcc8-773d-4bcf-bb47-b335b44596af","Type":"ContainerStarted","Data":"3edbb223d6fef8a85b8feb6470948d9f3046cd9885da20ab5130b8d9aa5b4455"} Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.199649 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.202349 
4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" event={"ID":"8aa829e4-03d9-4359-9d6e-a7ce76a2072b","Type":"ContainerDied","Data":"799bd116e83b286f94d1f2d2a93bbe96850dab88e087c903c4de9d37496f53f7"} Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.202404 4965 scope.go:117] "RemoveContainer" containerID="4689c3919008d0d543f0fe1f20eefbade43853d2d91b298cf743a808bb1ae4a9" Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.202369 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sk66r" Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.206213 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.221468 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" podStartSLOduration=7.221444373 podStartE2EDuration="7.221444373s" podCreationTimestamp="2026-03-18 11:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:35.2210009 +0000 UTC m=+180.207188409" watchObservedRunningTime="2026-03-18 11:59:35.221444373 +0000 UTC m=+180.207631882" Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.270292 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sk66r"] Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.275282 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sk66r"] Mar 18 11:59:35 crc kubenswrapper[4965]: I0318 11:59:35.772473 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6pk8b"] Mar 18 
11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.031621 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aa829e4-03d9-4359-9d6e-a7ce76a2072b" path="/var/lib/kubelet/pods/8aa829e4-03d9-4359-9d6e-a7ce76a2072b/volumes" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.207294 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6pk8b" podUID="4329347a-487b-4012-a715-9565bc4e67d0" containerName="registry-server" containerID="cri-o://14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89" gracePeriod=2 Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.301163 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-65b7d57b79-4ggt4"] Mar 18 11:59:36 crc kubenswrapper[4965]: E0318 11:59:36.301857 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa829e4-03d9-4359-9d6e-a7ce76a2072b" containerName="oauth-openshift" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.301878 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa829e4-03d9-4359-9d6e-a7ce76a2072b" containerName="oauth-openshift" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.302354 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa829e4-03d9-4359-9d6e-a7ce76a2072b" containerName="oauth-openshift" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.306683 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.308164 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.309278 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65b7d57b79-4ggt4"] Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.310224 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.310293 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.310532 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.310584 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.310701 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.310904 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.311362 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.311520 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 
18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.311538 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.312426 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.312755 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.321374 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.324466 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.336203 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464136 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-template-error\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464207 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " 
pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464252 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464287 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464374 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-audit-policies\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464470 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-template-login\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464522 4965 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-audit-dir\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464567 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464607 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464640 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464704 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464752 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464797 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-session\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.464861 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2l2r\" (UniqueName: \"kubernetes.io/projected/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-kube-api-access-d2l2r\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.481169 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.536362 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.565486 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-session\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.565535 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.565584 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2l2r\" (UniqueName: \"kubernetes.io/projected/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-kube-api-access-d2l2r\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.565832 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-template-error\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.565898 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.565951 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.565984 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.566060 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-audit-policies\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.566134 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-audit-dir\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc 
kubenswrapper[4965]: I0318 11:59:36.566168 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.566190 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-template-login\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.566222 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.566326 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-audit-dir\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.567190 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-audit-policies\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: 
\"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.567329 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.567537 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.567625 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.567784 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.569131 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.570986 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-session\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.571469 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-template-error\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.572145 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.572668 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-template-login\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: 
\"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.572769 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.573628 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.574038 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.574900 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.585720 4965 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d2l2r\" (UniqueName: \"kubernetes.io/projected/7e4a7ab8-9f01-461c-9d78-8763a76a18fb-kube-api-access-d2l2r\") pod \"oauth-openshift-65b7d57b79-4ggt4\" (UID: \"7e4a7ab8-9f01-461c-9d78-8763a76a18fb\") " pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.620269 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.644223 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.769536 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-utilities\") pod \"4329347a-487b-4012-a715-9565bc4e67d0\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.769803 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-catalog-content\") pod \"4329347a-487b-4012-a715-9565bc4e67d0\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.769839 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfc57\" (UniqueName: \"kubernetes.io/projected/4329347a-487b-4012-a715-9565bc4e67d0-kube-api-access-tfc57\") pod \"4329347a-487b-4012-a715-9565bc4e67d0\" (UID: \"4329347a-487b-4012-a715-9565bc4e67d0\") " Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.770504 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-utilities" (OuterVolumeSpecName: "utilities") pod "4329347a-487b-4012-a715-9565bc4e67d0" (UID: "4329347a-487b-4012-a715-9565bc4e67d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.774820 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4329347a-487b-4012-a715-9565bc4e67d0-kube-api-access-tfc57" (OuterVolumeSpecName: "kube-api-access-tfc57") pod "4329347a-487b-4012-a715-9565bc4e67d0" (UID: "4329347a-487b-4012-a715-9565bc4e67d0"). InnerVolumeSpecName "kube-api-access-tfc57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.818010 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4329347a-487b-4012-a715-9565bc4e67d0" (UID: "4329347a-487b-4012-a715-9565bc4e67d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.871631 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.871722 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4329347a-487b-4012-a715-9565bc4e67d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:36 crc kubenswrapper[4965]: I0318 11:59:36.871748 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfc57\" (UniqueName: \"kubernetes.io/projected/4329347a-487b-4012-a715-9565bc4e67d0-kube-api-access-tfc57\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.021562 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65b7d57b79-4ggt4"] Mar 18 11:59:37 crc kubenswrapper[4965]: W0318 11:59:37.027468 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e4a7ab8_9f01_461c_9d78_8763a76a18fb.slice/crio-07a99a3a356c517527b55618c306b23e9fc760ac90dadacb247d2cec2e1970ed WatchSource:0}: Error finding container 07a99a3a356c517527b55618c306b23e9fc760ac90dadacb247d2cec2e1970ed: Status 404 returned error can't find the container with id 07a99a3a356c517527b55618c306b23e9fc760ac90dadacb247d2cec2e1970ed Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.217131 4965 generic.go:334] "Generic (PLEG): container finished" podID="4329347a-487b-4012-a715-9565bc4e67d0" containerID="14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89" exitCode=0 Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.217186 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6pk8b" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.217302 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pk8b" event={"ID":"4329347a-487b-4012-a715-9565bc4e67d0","Type":"ContainerDied","Data":"14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89"} Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.217482 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pk8b" event={"ID":"4329347a-487b-4012-a715-9565bc4e67d0","Type":"ContainerDied","Data":"2f144bfbd2b49c1599ec7dd0ad239255137becc433f7c5447678e433e9fcbab6"} Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.217621 4965 scope.go:117] "RemoveContainer" containerID="14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.221995 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" event={"ID":"7e4a7ab8-9f01-461c-9d78-8763a76a18fb","Type":"ContainerStarted","Data":"07a99a3a356c517527b55618c306b23e9fc760ac90dadacb247d2cec2e1970ed"} Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.251842 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6pk8b"] Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.252895 4965 scope.go:117] "RemoveContainer" containerID="d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.255290 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6pk8b"] Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.277683 4965 scope.go:117] "RemoveContainer" containerID="2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 
11:59:37.292594 4965 scope.go:117] "RemoveContainer" containerID="14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89" Mar 18 11:59:37 crc kubenswrapper[4965]: E0318 11:59:37.292996 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89\": container with ID starting with 14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89 not found: ID does not exist" containerID="14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.293046 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89"} err="failed to get container status \"14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89\": rpc error: code = NotFound desc = could not find container \"14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89\": container with ID starting with 14eeb82905df737bac64135f574c93cb7221347906949fdd72c4d9a4f04a0b89 not found: ID does not exist" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.293079 4965 scope.go:117] "RemoveContainer" containerID="d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df" Mar 18 11:59:37 crc kubenswrapper[4965]: E0318 11:59:37.293372 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df\": container with ID starting with d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df not found: ID does not exist" containerID="d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.293409 4965 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df"} err="failed to get container status \"d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df\": rpc error: code = NotFound desc = could not find container \"d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df\": container with ID starting with d475773f27c577265c846fe389b488c2e03ce76830a459d5a269c4731d31a6df not found: ID does not exist" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.293434 4965 scope.go:117] "RemoveContainer" containerID="2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b" Mar 18 11:59:37 crc kubenswrapper[4965]: E0318 11:59:37.293888 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b\": container with ID starting with 2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b not found: ID does not exist" containerID="2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b" Mar 18 11:59:37 crc kubenswrapper[4965]: I0318 11:59:37.293920 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b"} err="failed to get container status \"2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b\": rpc error: code = NotFound desc = could not find container \"2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b\": container with ID starting with 2ef6855b2bdc0363f98390332d696682272c51322e7a695dbd3a452f00e4e09b not found: ID does not exist" Mar 18 11:59:38 crc kubenswrapper[4965]: I0318 11:59:38.034198 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4329347a-487b-4012-a715-9565bc4e67d0" path="/var/lib/kubelet/pods/4329347a-487b-4012-a715-9565bc4e67d0/volumes" Mar 18 11:59:38 crc kubenswrapper[4965]: I0318 
11:59:38.232111 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" event={"ID":"7e4a7ab8-9f01-461c-9d78-8763a76a18fb","Type":"ContainerStarted","Data":"45b3f3b0d8f73264c6653bfacfc67b8701da2566395ba8cd93f5bc0d5e90cd7d"} Mar 18 11:59:38 crc kubenswrapper[4965]: I0318 11:59:38.232594 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:38 crc kubenswrapper[4965]: I0318 11:59:38.241433 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" Mar 18 11:59:38 crc kubenswrapper[4965]: I0318 11:59:38.261093 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-65b7d57b79-4ggt4" podStartSLOduration=30.261068342 podStartE2EDuration="30.261068342s" podCreationTimestamp="2026-03-18 11:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:38.259713643 +0000 UTC m=+183.245901202" watchObservedRunningTime="2026-03-18 11:59:38.261068342 +0000 UTC m=+183.247255851" Mar 18 11:59:39 crc kubenswrapper[4965]: I0318 11:59:39.972839 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bq6m2"] Mar 18 11:59:39 crc kubenswrapper[4965]: I0318 11:59:39.973045 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bq6m2" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerName="registry-server" containerID="cri-o://271f75a53c1cf419b7e3cfc294577dad2d7a027a42854facf90df16ff6a273c8" gracePeriod=2 Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.243573 4965 generic.go:334] "Generic (PLEG): container finished" podID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" 
containerID="271f75a53c1cf419b7e3cfc294577dad2d7a027a42854facf90df16ff6a273c8" exitCode=0 Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.243685 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq6m2" event={"ID":"6c7d53d7-b4fc-48d5-b8f4-545ea972697e","Type":"ContainerDied","Data":"271f75a53c1cf419b7e3cfc294577dad2d7a027a42854facf90df16ff6a273c8"} Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.405486 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.515853 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-utilities\") pod \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.515998 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx2l2\" (UniqueName: \"kubernetes.io/projected/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-kube-api-access-hx2l2\") pod \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.516029 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-catalog-content\") pod \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\" (UID: \"6c7d53d7-b4fc-48d5-b8f4-545ea972697e\") " Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.517219 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-utilities" (OuterVolumeSpecName: "utilities") pod "6c7d53d7-b4fc-48d5-b8f4-545ea972697e" (UID: 
"6c7d53d7-b4fc-48d5-b8f4-545ea972697e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.525731 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-kube-api-access-hx2l2" (OuterVolumeSpecName: "kube-api-access-hx2l2") pod "6c7d53d7-b4fc-48d5-b8f4-545ea972697e" (UID: "6c7d53d7-b4fc-48d5-b8f4-545ea972697e"). InnerVolumeSpecName "kube-api-access-hx2l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.617540 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.617599 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx2l2\" (UniqueName: \"kubernetes.io/projected/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-kube-api-access-hx2l2\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.642794 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c7d53d7-b4fc-48d5-b8f4-545ea972697e" (UID: "6c7d53d7-b4fc-48d5-b8f4-545ea972697e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:59:40 crc kubenswrapper[4965]: I0318 11:59:40.719634 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c7d53d7-b4fc-48d5-b8f4-545ea972697e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:41 crc kubenswrapper[4965]: I0318 11:59:41.255436 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq6m2" event={"ID":"6c7d53d7-b4fc-48d5-b8f4-545ea972697e","Type":"ContainerDied","Data":"5aff5bb0025284ff396efdc2e6d96cddd21b628119bdc55a0148aab0566e4147"} Mar 18 11:59:41 crc kubenswrapper[4965]: I0318 11:59:41.255524 4965 scope.go:117] "RemoveContainer" containerID="271f75a53c1cf419b7e3cfc294577dad2d7a027a42854facf90df16ff6a273c8" Mar 18 11:59:41 crc kubenswrapper[4965]: I0318 11:59:41.255545 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bq6m2" Mar 18 11:59:41 crc kubenswrapper[4965]: I0318 11:59:41.289363 4965 scope.go:117] "RemoveContainer" containerID="9261b312524f789032b5d10f99b55317718d7d02ad982aa7199d7f77d2e70d43" Mar 18 11:59:41 crc kubenswrapper[4965]: I0318 11:59:41.301708 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bq6m2"] Mar 18 11:59:41 crc kubenswrapper[4965]: I0318 11:59:41.311595 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bq6m2"] Mar 18 11:59:41 crc kubenswrapper[4965]: I0318 11:59:41.329143 4965 scope.go:117] "RemoveContainer" containerID="b5b6aebee14d591b0ff69b46c82bba28db042ce8a323508fa411429f5bc2ed92" Mar 18 11:59:42 crc kubenswrapper[4965]: I0318 11:59:42.028279 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" path="/var/lib/kubelet/pods/6c7d53d7-b4fc-48d5-b8f4-545ea972697e/volumes" Mar 18 11:59:48 crc 
kubenswrapper[4965]: I0318 11:59:48.156811 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd"] Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.157604 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" podUID="ef57bcc8-773d-4bcf-bb47-b335b44596af" containerName="controller-manager" containerID="cri-o://e28a859dc4e00ba84963568fa5ea14323a1108583c098fac093e758bc4c19ccd" gracePeriod=30 Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.254104 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr"] Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.254318 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" podUID="a95184d8-d4d1-429e-919c-629b26c7a5f2" containerName="route-controller-manager" containerID="cri-o://2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755" gracePeriod=30 Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.298850 4965 generic.go:334] "Generic (PLEG): container finished" podID="ef57bcc8-773d-4bcf-bb47-b335b44596af" containerID="e28a859dc4e00ba84963568fa5ea14323a1108583c098fac093e758bc4c19ccd" exitCode=0 Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.298908 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" event={"ID":"ef57bcc8-773d-4bcf-bb47-b335b44596af","Type":"ContainerDied","Data":"e28a859dc4e00ba84963568fa5ea14323a1108583c098fac093e758bc4c19ccd"} Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.741110 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.747904 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.939918 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-client-ca\") pod \"ef57bcc8-773d-4bcf-bb47-b335b44596af\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.939958 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-config\") pod \"a95184d8-d4d1-429e-919c-629b26c7a5f2\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.939981 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a95184d8-d4d1-429e-919c-629b26c7a5f2-serving-cert\") pod \"a95184d8-d4d1-429e-919c-629b26c7a5f2\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.940038 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef57bcc8-773d-4bcf-bb47-b335b44596af-serving-cert\") pod \"ef57bcc8-773d-4bcf-bb47-b335b44596af\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.940059 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-config\") pod \"ef57bcc8-773d-4bcf-bb47-b335b44596af\" (UID: 
\"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.940075 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4h7l\" (UniqueName: \"kubernetes.io/projected/ef57bcc8-773d-4bcf-bb47-b335b44596af-kube-api-access-h4h7l\") pod \"ef57bcc8-773d-4bcf-bb47-b335b44596af\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.940091 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-proxy-ca-bundles\") pod \"ef57bcc8-773d-4bcf-bb47-b335b44596af\" (UID: \"ef57bcc8-773d-4bcf-bb47-b335b44596af\") " Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.940118 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd7km\" (UniqueName: \"kubernetes.io/projected/a95184d8-d4d1-429e-919c-629b26c7a5f2-kube-api-access-kd7km\") pod \"a95184d8-d4d1-429e-919c-629b26c7a5f2\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.940139 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-client-ca\") pod \"a95184d8-d4d1-429e-919c-629b26c7a5f2\" (UID: \"a95184d8-d4d1-429e-919c-629b26c7a5f2\") " Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.940689 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-client-ca" (OuterVolumeSpecName: "client-ca") pod "a95184d8-d4d1-429e-919c-629b26c7a5f2" (UID: "a95184d8-d4d1-429e-919c-629b26c7a5f2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.940701 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-client-ca" (OuterVolumeSpecName: "client-ca") pod "ef57bcc8-773d-4bcf-bb47-b335b44596af" (UID: "ef57bcc8-773d-4bcf-bb47-b335b44596af"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.941057 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-config" (OuterVolumeSpecName: "config") pod "a95184d8-d4d1-429e-919c-629b26c7a5f2" (UID: "a95184d8-d4d1-429e-919c-629b26c7a5f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.941816 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ef57bcc8-773d-4bcf-bb47-b335b44596af" (UID: "ef57bcc8-773d-4bcf-bb47-b335b44596af"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.941911 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-config" (OuterVolumeSpecName: "config") pod "ef57bcc8-773d-4bcf-bb47-b335b44596af" (UID: "ef57bcc8-773d-4bcf-bb47-b335b44596af"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.945166 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef57bcc8-773d-4bcf-bb47-b335b44596af-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ef57bcc8-773d-4bcf-bb47-b335b44596af" (UID: "ef57bcc8-773d-4bcf-bb47-b335b44596af"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.945273 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a95184d8-d4d1-429e-919c-629b26c7a5f2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a95184d8-d4d1-429e-919c-629b26c7a5f2" (UID: "a95184d8-d4d1-429e-919c-629b26c7a5f2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.946229 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef57bcc8-773d-4bcf-bb47-b335b44596af-kube-api-access-h4h7l" (OuterVolumeSpecName: "kube-api-access-h4h7l") pod "ef57bcc8-773d-4bcf-bb47-b335b44596af" (UID: "ef57bcc8-773d-4bcf-bb47-b335b44596af"). InnerVolumeSpecName "kube-api-access-h4h7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:48 crc kubenswrapper[4965]: I0318 11:59:48.946416 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a95184d8-d4d1-429e-919c-629b26c7a5f2-kube-api-access-kd7km" (OuterVolumeSpecName: "kube-api-access-kd7km") pod "a95184d8-d4d1-429e-919c-629b26c7a5f2" (UID: "a95184d8-d4d1-429e-919c-629b26c7a5f2"). InnerVolumeSpecName "kube-api-access-kd7km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.041285 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef57bcc8-773d-4bcf-bb47-b335b44596af-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.041331 4965 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.041341 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4h7l\" (UniqueName: \"kubernetes.io/projected/ef57bcc8-773d-4bcf-bb47-b335b44596af-kube-api-access-h4h7l\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.041351 4965 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.041362 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd7km\" (UniqueName: \"kubernetes.io/projected/a95184d8-d4d1-429e-919c-629b26c7a5f2-kube-api-access-kd7km\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.041370 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.041378 4965 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef57bcc8-773d-4bcf-bb47-b335b44596af-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.041385 4965 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a95184d8-d4d1-429e-919c-629b26c7a5f2-config\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.041393 4965 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a95184d8-d4d1-429e-919c-629b26c7a5f2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.304836 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p"] Mar 18 11:59:49 crc kubenswrapper[4965]: E0318 11:59:49.305216 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerName="registry-server" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305232 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerName="registry-server" Mar 18 11:59:49 crc kubenswrapper[4965]: E0318 11:59:49.305248 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerName="extract-content" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305256 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerName="extract-content" Mar 18 11:59:49 crc kubenswrapper[4965]: E0318 11:59:49.305269 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4329347a-487b-4012-a715-9565bc4e67d0" containerName="extract-content" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305276 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4329347a-487b-4012-a715-9565bc4e67d0" containerName="extract-content" Mar 18 11:59:49 crc kubenswrapper[4965]: E0318 11:59:49.305287 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4329347a-487b-4012-a715-9565bc4e67d0" containerName="registry-server" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305294 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4329347a-487b-4012-a715-9565bc4e67d0" containerName="registry-server" Mar 18 11:59:49 crc kubenswrapper[4965]: E0318 11:59:49.305302 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4329347a-487b-4012-a715-9565bc4e67d0" containerName="extract-utilities" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305309 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="4329347a-487b-4012-a715-9565bc4e67d0" containerName="extract-utilities" Mar 18 11:59:49 crc kubenswrapper[4965]: E0318 11:59:49.305322 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef57bcc8-773d-4bcf-bb47-b335b44596af" containerName="controller-manager" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305329 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef57bcc8-773d-4bcf-bb47-b335b44596af" containerName="controller-manager" Mar 18 11:59:49 crc kubenswrapper[4965]: E0318 11:59:49.305341 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95184d8-d4d1-429e-919c-629b26c7a5f2" containerName="route-controller-manager" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305347 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="a95184d8-d4d1-429e-919c-629b26c7a5f2" containerName="route-controller-manager" Mar 18 11:59:49 crc kubenswrapper[4965]: E0318 11:59:49.305357 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerName="extract-utilities" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305365 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerName="extract-utilities" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305499 4965 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4329347a-487b-4012-a715-9565bc4e67d0" containerName="registry-server" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305510 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef57bcc8-773d-4bcf-bb47-b335b44596af" containerName="controller-manager" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305525 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c7d53d7-b4fc-48d5-b8f4-545ea972697e" containerName="registry-server" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.305535 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="a95184d8-d4d1-429e-919c-629b26c7a5f2" containerName="route-controller-manager" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.306061 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.308075 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" event={"ID":"ef57bcc8-773d-4bcf-bb47-b335b44596af","Type":"ContainerDied","Data":"3edbb223d6fef8a85b8feb6470948d9f3046cd9885da20ab5130b8d9aa5b4455"} Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.308153 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.308160 4965 scope.go:117] "RemoveContainer" containerID="e28a859dc4e00ba84963568fa5ea14323a1108583c098fac093e758bc4c19ccd" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.312197 4965 generic.go:334] "Generic (PLEG): container finished" podID="a95184d8-d4d1-429e-919c-629b26c7a5f2" containerID="2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755" exitCode=0 Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.312224 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" event={"ID":"a95184d8-d4d1-429e-919c-629b26c7a5f2","Type":"ContainerDied","Data":"2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755"} Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.312248 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" event={"ID":"a95184d8-d4d1-429e-919c-629b26c7a5f2","Type":"ContainerDied","Data":"5cf4e84e0cda6ee375c54b249b53add1467e5f82e6ff620f5f0f05dec68af529"} Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.312294 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.317866 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6697dcf79c-pm6f6"] Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.318721 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.323979 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6697dcf79c-pm6f6"] Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.324788 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.327092 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.327479 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.327851 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.327960 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.328287 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.328589 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p"] Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.333701 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.335973 4965 scope.go:117] "RemoveContainer" containerID="2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755" Mar 18 11:59:49 crc 
kubenswrapper[4965]: I0318 11:59:49.353296 4965 scope.go:117] "RemoveContainer" containerID="2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755" Mar 18 11:59:49 crc kubenswrapper[4965]: E0318 11:59:49.357553 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755\": container with ID starting with 2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755 not found: ID does not exist" containerID="2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.357605 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755"} err="failed to get container status \"2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755\": rpc error: code = NotFound desc = could not find container \"2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755\": container with ID starting with 2f855f2f727a1a0c4a23fd322ef912930206f8a6ba01226f9c6c736f358e2755 not found: ID does not exist" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.357901 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51600f53-0871-4cb0-9579-6cd2f6938c4f-proxy-ca-bundles\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.359445 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51600f53-0871-4cb0-9579-6cd2f6938c4f-client-ca\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: 
\"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.359499 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51600f53-0871-4cb0-9579-6cd2f6938c4f-config\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.359538 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51600f53-0871-4cb0-9579-6cd2f6938c4f-serving-cert\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.359570 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7gb6\" (UniqueName: \"kubernetes.io/projected/51600f53-0871-4cb0-9579-6cd2f6938c4f-kube-api-access-x7gb6\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.373099 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd"] Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.381323 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cbcbf47c-lb9gd"] Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.395202 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr"] Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.399055 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cf54489d-xz6nr"] Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.460122 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-config\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.460180 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51600f53-0871-4cb0-9579-6cd2f6938c4f-proxy-ca-bundles\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.460203 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdhrt\" (UniqueName: \"kubernetes.io/projected/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-kube-api-access-cdhrt\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.460224 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-serving-cert\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " 
pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.460254 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-client-ca\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.460368 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51600f53-0871-4cb0-9579-6cd2f6938c4f-client-ca\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.460458 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51600f53-0871-4cb0-9579-6cd2f6938c4f-config\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.460507 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51600f53-0871-4cb0-9579-6cd2f6938c4f-serving-cert\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.460555 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7gb6\" (UniqueName: 
\"kubernetes.io/projected/51600f53-0871-4cb0-9579-6cd2f6938c4f-kube-api-access-x7gb6\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.461343 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51600f53-0871-4cb0-9579-6cd2f6938c4f-client-ca\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.461382 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51600f53-0871-4cb0-9579-6cd2f6938c4f-proxy-ca-bundles\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.461701 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51600f53-0871-4cb0-9579-6cd2f6938c4f-config\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.467886 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51600f53-0871-4cb0-9579-6cd2f6938c4f-serving-cert\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.477365 4965 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x7gb6\" (UniqueName: \"kubernetes.io/projected/51600f53-0871-4cb0-9579-6cd2f6938c4f-kube-api-access-x7gb6\") pod \"controller-manager-6697dcf79c-pm6f6\" (UID: \"51600f53-0871-4cb0-9579-6cd2f6938c4f\") " pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.561385 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-config\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.561468 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdhrt\" (UniqueName: \"kubernetes.io/projected/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-kube-api-access-cdhrt\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.561504 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-serving-cert\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.561546 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-client-ca\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " 
pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.562754 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-config\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.562769 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-client-ca\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.566015 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-serving-cert\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.579460 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdhrt\" (UniqueName: \"kubernetes.io/projected/ae6bcd65-c875-493c-a4bd-68f2c4a26a83-kube-api-access-cdhrt\") pod \"route-controller-manager-5695bc7c8b-98j2p\" (UID: \"ae6bcd65-c875-493c-a4bd-68f2c4a26a83\") " pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.657128 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:49 crc kubenswrapper[4965]: I0318 11:59:49.661388 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.027381 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a95184d8-d4d1-429e-919c-629b26c7a5f2" path="/var/lib/kubelet/pods/a95184d8-d4d1-429e-919c-629b26c7a5f2/volumes" Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.028269 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef57bcc8-773d-4bcf-bb47-b335b44596af" path="/var/lib/kubelet/pods/ef57bcc8-773d-4bcf-bb47-b335b44596af/volumes" Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.085842 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p"] Mar 18 11:59:50 crc kubenswrapper[4965]: W0318 11:59:50.093024 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae6bcd65_c875_493c_a4bd_68f2c4a26a83.slice/crio-e047b37ba44072b6a8d17db5f201f3467cf863bf912cccd6aa57643ac1250b75 WatchSource:0}: Error finding container e047b37ba44072b6a8d17db5f201f3467cf863bf912cccd6aa57643ac1250b75: Status 404 returned error can't find the container with id e047b37ba44072b6a8d17db5f201f3467cf863bf912cccd6aa57643ac1250b75 Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.138355 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6697dcf79c-pm6f6"] Mar 18 11:59:50 crc kubenswrapper[4965]: W0318 11:59:50.149732 4965 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51600f53_0871_4cb0_9579_6cd2f6938c4f.slice/crio-81677878c19b674797aa60060445da1c395c94ba51cca94c75b710de37229acc WatchSource:0}: Error finding container 81677878c19b674797aa60060445da1c395c94ba51cca94c75b710de37229acc: Status 404 returned error can't find the container with id 81677878c19b674797aa60060445da1c395c94ba51cca94c75b710de37229acc Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.319580 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" event={"ID":"ae6bcd65-c875-493c-a4bd-68f2c4a26a83","Type":"ContainerStarted","Data":"c29237816e0aff842ae6d5f5a5483d61f96f6ecf371fa85d5eb4e0a08cf1dcbb"} Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.319637 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" event={"ID":"ae6bcd65-c875-493c-a4bd-68f2c4a26a83","Type":"ContainerStarted","Data":"e047b37ba44072b6a8d17db5f201f3467cf863bf912cccd6aa57643ac1250b75"} Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.319771 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.323350 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" event={"ID":"51600f53-0871-4cb0-9579-6cd2f6938c4f","Type":"ContainerStarted","Data":"1cf45fbfd5d78551cad1cf9f044ecbaf03d6b7842bb0646b497497d4815cad47"} Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.323399 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" 
event={"ID":"51600f53-0871-4cb0-9579-6cd2f6938c4f","Type":"ContainerStarted","Data":"81677878c19b674797aa60060445da1c395c94ba51cca94c75b710de37229acc"} Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.360260 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" podStartSLOduration=2.360240657 podStartE2EDuration="2.360240657s" podCreationTimestamp="2026-03-18 11:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:50.344985008 +0000 UTC m=+195.331172487" watchObservedRunningTime="2026-03-18 11:59:50.360240657 +0000 UTC m=+195.346428136" Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.362894 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" podStartSLOduration=2.362880187 podStartE2EDuration="2.362880187s" podCreationTimestamp="2026-03-18 11:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:59:50.36163899 +0000 UTC m=+195.347826469" watchObservedRunningTime="2026-03-18 11:59:50.362880187 +0000 UTC m=+195.349067666" Mar 18 11:59:50 crc kubenswrapper[4965]: I0318 11:59:50.488022 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5695bc7c8b-98j2p" Mar 18 11:59:51 crc kubenswrapper[4965]: I0318 11:59:51.329438 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:51 crc kubenswrapper[4965]: I0318 11:59:51.340772 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6697dcf79c-pm6f6" Mar 18 11:59:54 
crc kubenswrapper[4965]: I0318 11:59:54.027532 4965 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.027800 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1489cfbddc1072ee2238c82e981b8dd676b935c235de66cb53613a2479018238" gracePeriod=15 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.027832 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://35983903bdfe8136c0d115201f4bfb23009d1cb8019ea7f9b647614eb8b27afe" gracePeriod=15 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.027861 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://90185fb5121b4cb1ffd94f24cce668838aa69c89a757de56246a7e9e3254005a" gracePeriod=15 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.027825 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e6be9268fda0866f34859b014813011535f476e4a3004010d537f23df27440f1" gracePeriod=15 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.027888 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
containerID="cri-o://3c1605c2a78731623f65967e7b4af0bc71ba4b55c952a45bd2b093a59ac425a8" gracePeriod=15 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028192 4965 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028342 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028352 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028364 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028370 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028381 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028387 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028393 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028398 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028407 4965 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028412 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028420 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028426 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028432 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028438 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028443 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028448 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028538 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028548 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028554 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028560 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028570 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028577 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028582 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028590 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028687 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028695 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.028706 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 
11:59:54.028711 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.028803 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.029781 4965 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.030610 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.035914 4965 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.065303 4965 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.118285 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.118324 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.118380 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.118404 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.118564 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.118611 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.118863 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.118987 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220331 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220389 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220439 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220467 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220492 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220511 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220552 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220559 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220597 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" 
(UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220603 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220619 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220639 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220463 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220691 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220694 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.220715 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.349451 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.352967 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.354005 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3c1605c2a78731623f65967e7b4af0bc71ba4b55c952a45bd2b093a59ac425a8" exitCode=0 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.354036 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e6be9268fda0866f34859b014813011535f476e4a3004010d537f23df27440f1" exitCode=0 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.354045 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="35983903bdfe8136c0d115201f4bfb23009d1cb8019ea7f9b647614eb8b27afe" exitCode=0 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.354053 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="90185fb5121b4cb1ffd94f24cce668838aa69c89a757de56246a7e9e3254005a" exitCode=2 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.354166 4965 scope.go:117] "RemoveContainer" containerID="733d0d5c89687c2a3f5a4c5bed039b30b31430fe8959663bb82b04e5c4b6eb76" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.356751 4965 generic.go:334] "Generic (PLEG): container finished" podID="2f56f075-8aa8-4030-849a-ae1fc80135ef" containerID="3851d115e5176fbc2d45309d5534661867861818d77bc7e54f548c170a496872" exitCode=0 Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.356816 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2f56f075-8aa8-4030-849a-ae1fc80135ef","Type":"ContainerDied","Data":"3851d115e5176fbc2d45309d5534661867861818d77bc7e54f548c170a496872"} Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.357686 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:54 crc kubenswrapper[4965]: I0318 11:59:54.367750 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:54 crc kubenswrapper[4965]: W0318 11:59:54.399047 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-79fdad8bd53179c4e75c2523684f9ee84f2364d6b7925ed0511ea01edd2253da WatchSource:0}: Error finding container 79fdad8bd53179c4e75c2523684f9ee84f2364d6b7925ed0511ea01edd2253da: Status 404 returned error can't find the container with id 79fdad8bd53179c4e75c2523684f9ee84f2364d6b7925ed0511ea01edd2253da Mar 18 11:59:54 crc kubenswrapper[4965]: E0318 11:59:54.402031 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189dedb60c62a8bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:59:54.401466556 +0000 UTC m=+199.387654035,LastTimestamp:2026-03-18 11:59:54.401466556 +0000 UTC m=+199.387654035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.371994 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"232d60ca378e6098c064c7121784e992af17a294a74508322c3a4d5dcae4f6f7"} Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.372348 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"79fdad8bd53179c4e75c2523684f9ee84f2364d6b7925ed0511ea01edd2253da"} Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.373089 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:55 crc kubenswrapper[4965]: E0318 11:59:55.373119 4965 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.374727 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.706840 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.707434 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.738964 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f56f075-8aa8-4030-849a-ae1fc80135ef-kube-api-access\") pod \"2f56f075-8aa8-4030-849a-ae1fc80135ef\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.739050 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-kubelet-dir\") pod \"2f56f075-8aa8-4030-849a-ae1fc80135ef\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.739083 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-var-lock\") pod \"2f56f075-8aa8-4030-849a-ae1fc80135ef\" (UID: \"2f56f075-8aa8-4030-849a-ae1fc80135ef\") " Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.739281 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-var-lock" (OuterVolumeSpecName: "var-lock") pod "2f56f075-8aa8-4030-849a-ae1fc80135ef" (UID: "2f56f075-8aa8-4030-849a-ae1fc80135ef"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.739552 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2f56f075-8aa8-4030-849a-ae1fc80135ef" (UID: "2f56f075-8aa8-4030-849a-ae1fc80135ef"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.745797 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f56f075-8aa8-4030-849a-ae1fc80135ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2f56f075-8aa8-4030-849a-ae1fc80135ef" (UID: "2f56f075-8aa8-4030-849a-ae1fc80135ef"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.840336 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f56f075-8aa8-4030-849a-ae1fc80135ef-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.840380 4965 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:55 crc kubenswrapper[4965]: I0318 11:59:55.840391 4965 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2f56f075-8aa8-4030-849a-ae1fc80135ef-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.026058 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:56 crc kubenswrapper[4965]: E0318 11:59:56.187670 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:56 crc kubenswrapper[4965]: E0318 11:59:56.188853 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:56 crc kubenswrapper[4965]: E0318 11:59:56.189372 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:56 crc kubenswrapper[4965]: E0318 11:59:56.190165 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:56 crc kubenswrapper[4965]: E0318 11:59:56.190480 4965 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.190514 4965 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 11:59:56 crc kubenswrapper[4965]: E0318 11:59:56.190950 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.386977 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.387898 4965 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1489cfbddc1072ee2238c82e981b8dd676b935c235de66cb53613a2479018238" exitCode=0 Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.388052 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85b906892b3742a9d0b101801e8f8a821cd70f01f5b6e44cde49dde262721b9c" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.390060 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"2f56f075-8aa8-4030-849a-ae1fc80135ef","Type":"ContainerDied","Data":"265d94ea60971145ea8cf5dfefabe26919a9b69d16dcb2c90b292a6768ed0593"} Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.390079 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.390103 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="265d94ea60971145ea8cf5dfefabe26919a9b69d16dcb2c90b292a6768ed0593" Mar 18 11:59:56 crc kubenswrapper[4965]: E0318 11:59:56.392045 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.396795 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.400209 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.401051 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.401579 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.401896 4965 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.446761 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.446812 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.446904 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.446920 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.446952 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.446964 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.447428 4965 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.447452 4965 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:56 crc kubenswrapper[4965]: I0318 11:59:56.447476 4965 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 11:59:56 crc kubenswrapper[4965]: E0318 11:59:56.793883 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms" Mar 18 11:59:57 crc kubenswrapper[4965]: I0318 11:59:57.395161 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 11:59:57 crc kubenswrapper[4965]: I0318 11:59:57.408217 4965 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:57 crc kubenswrapper[4965]: I0318 11:59:57.408734 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 11:59:57 crc kubenswrapper[4965]: E0318 11:59:57.594960 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s" Mar 18 11:59:58 crc kubenswrapper[4965]: I0318 11:59:58.030140 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 11:59:58 crc kubenswrapper[4965]: E0318 11:59:58.374850 4965 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189dedb60c62a8bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 11:59:54.401466556 +0000 UTC m=+199.387654035,LastTimestamp:2026-03-18 11:59:54.401466556 +0000 UTC m=+199.387654035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 11:59:59 crc kubenswrapper[4965]: E0318 11:59:59.196221 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s" Mar 18 12:00:02 crc kubenswrapper[4965]: E0318 12:00:02.397718 4965 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="6.4s" Mar 18 12:00:06 crc kubenswrapper[4965]: I0318 12:00:06.024423 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 12:00:06 crc kubenswrapper[4965]: I0318 12:00:06.448357 4965 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:00:06 crc kubenswrapper[4965]: I0318 12:00:06.449895 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 12:00:06 crc kubenswrapper[4965]: I0318 12:00:06.449968 4965 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9dd304c138d70e2bab50741090aace5284f04e2d25fe04cf9188a91b6976f720" exitCode=1 Mar 18 12:00:06 crc kubenswrapper[4965]: I0318 12:00:06.450026 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9dd304c138d70e2bab50741090aace5284f04e2d25fe04cf9188a91b6976f720"} Mar 18 12:00:06 crc kubenswrapper[4965]: I0318 12:00:06.450995 4965 scope.go:117] "RemoveContainer" containerID="9dd304c138d70e2bab50741090aace5284f04e2d25fe04cf9188a91b6976f720" Mar 18 12:00:06 crc kubenswrapper[4965]: I0318 12:00:06.451249 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 12:00:06 crc kubenswrapper[4965]: I0318 12:00:06.451997 4965 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 12:00:07 crc 
kubenswrapper[4965]: I0318 12:00:07.020376 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.021585 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.022201 4965 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.042958 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.043004 4965 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:07 crc kubenswrapper[4965]: E0318 12:00:07.044221 4965 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.045762 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.467319 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.468202 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.468309 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ee73de6fc8900ed347a66c5c1281716c698175d4d7ba52f4424c60f8f2730af0"} Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.469871 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.470565 4965 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.471112 4965 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="80d723af3e843229a2cc4cbe67f7d43fa02dcf01d66a1e1b40c372f13eeae82f" exitCode=0 Mar 18 12:00:07 crc 
kubenswrapper[4965]: I0318 12:00:07.471304 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"80d723af3e843229a2cc4cbe67f7d43fa02dcf01d66a1e1b40c372f13eeae82f"} Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.471482 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4dd621865442c1dd44b145b6b7c758c8d7f276acffdeec06bba0d2150b2f7e21"} Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.472025 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.472062 4965 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.472557 4965 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 12:00:07 crc kubenswrapper[4965]: E0318 12:00:07.472708 4965 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:07 crc kubenswrapper[4965]: I0318 12:00:07.473368 4965 status_manager.go:851] "Failed to get status for pod" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 18 12:00:08 crc kubenswrapper[4965]: I0318 12:00:08.484296 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8ae699e62757728508f49483be638ae9b5bc4af63cf311bff98eee54135ec0a5"} Mar 18 12:00:08 crc kubenswrapper[4965]: I0318 12:00:08.484580 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"16031e6059c1d3ffff3a1e25eb4baa91c5f4fa58cc27667623135824efbfb17c"} Mar 18 12:00:08 crc kubenswrapper[4965]: I0318 12:00:08.484591 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1ad022229a1a64e8978da83c150597a15704707eca424e0e81179f3e4cc12337"} Mar 18 12:00:08 crc kubenswrapper[4965]: I0318 12:00:08.484599 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"05ccb0c816d0b8140545551bfeab0f6417b6d82d000f35fe617f746b4da8afd1"} Mar 18 12:00:09 crc kubenswrapper[4965]: I0318 12:00:09.493711 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0e7fca16177cfc1c36fa6f803d86caf99b5160fb42d32ac07fa6ad7427d97591"} Mar 18 12:00:09 crc kubenswrapper[4965]: I0318 12:00:09.493998 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:09 crc 
kubenswrapper[4965]: I0318 12:00:09.494033 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:09 crc kubenswrapper[4965]: I0318 12:00:09.494051 4965 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:11 crc kubenswrapper[4965]: I0318 12:00:11.068210 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:00:11 crc kubenswrapper[4965]: I0318 12:00:11.069203 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 18 12:00:11 crc kubenswrapper[4965]: I0318 12:00:11.069380 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 18 12:00:11 crc kubenswrapper[4965]: I0318 12:00:11.602193 4965 patch_prober.go:28] interesting pod/machine-config-daemon-8l67n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:00:11 crc kubenswrapper[4965]: I0318 12:00:11.602280 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" podUID="e9a53215-1d0d-47de-92c4-cea3209fe4fa" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:00:12 crc kubenswrapper[4965]: I0318 12:00:12.046624 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:12 crc kubenswrapper[4965]: I0318 12:00:12.046722 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:12 crc kubenswrapper[4965]: I0318 12:00:12.052396 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:14 crc kubenswrapper[4965]: I0318 12:00:14.502461 4965 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:14 crc kubenswrapper[4965]: I0318 12:00:14.521270 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:14 crc kubenswrapper[4965]: I0318 12:00:14.521297 4965 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:14 crc kubenswrapper[4965]: I0318 12:00:14.525007 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:00:15 crc kubenswrapper[4965]: I0318 12:00:15.191026 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:00:15 crc kubenswrapper[4965]: I0318 12:00:15.528994 4965 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:15 crc kubenswrapper[4965]: I0318 12:00:15.529042 4965 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fccb5ad-ea8f-4354-82ce-8fc86b6cdfbc" Mar 18 12:00:16 crc kubenswrapper[4965]: I0318 12:00:16.045826 4965 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="34425aac-9812-4716-9055-e2686fe1beb0" Mar 18 12:00:21 crc kubenswrapper[4965]: I0318 12:00:21.068608 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 18 12:00:21 crc kubenswrapper[4965]: I0318 12:00:21.069996 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 18 12:00:24 crc kubenswrapper[4965]: I0318 12:00:24.248031 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 12:00:24 crc kubenswrapper[4965]: I0318 12:00:24.792594 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 12:00:24 crc kubenswrapper[4965]: I0318 12:00:24.891778 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 12:00:25 crc kubenswrapper[4965]: I0318 12:00:25.164735 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 12:00:25 crc kubenswrapper[4965]: I0318 12:00:25.730974 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"serving-cert" Mar 18 12:00:25 crc kubenswrapper[4965]: I0318 12:00:25.736543 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 12:00:25 crc kubenswrapper[4965]: I0318 12:00:25.932008 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 12:00:26 crc kubenswrapper[4965]: I0318 12:00:26.203958 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:00:26 crc kubenswrapper[4965]: I0318 12:00:26.252527 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 12:00:26 crc kubenswrapper[4965]: I0318 12:00:26.345996 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 12:00:26 crc kubenswrapper[4965]: I0318 12:00:26.464254 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 12:00:26 crc kubenswrapper[4965]: I0318 12:00:26.500090 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 12:00:26 crc kubenswrapper[4965]: I0318 12:00:26.613749 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 12:00:26 crc kubenswrapper[4965]: I0318 12:00:26.776183 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 12:00:26 crc kubenswrapper[4965]: I0318 12:00:26.963351 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 12:00:26 crc kubenswrapper[4965]: I0318 12:00:26.976828 4965 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.103086 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.134482 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.148026 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.432012 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.495544 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.534700 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.568069 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.603885 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.617883 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.638215 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 12:00:27 
crc kubenswrapper[4965]: I0318 12:00:27.673737 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.716495 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 12:00:27 crc kubenswrapper[4965]: I0318 12:00:27.933261 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.031279 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.292309 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.307270 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.320837 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.328607 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.374460 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.390575 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.476180 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.480433 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.511626 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.628391 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.792850 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.915244 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 12:00:28 crc kubenswrapper[4965]: I0318 12:00:28.960276 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.026776 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.056796 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.122391 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.136502 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.194355 4965 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.205556 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.219509 4965 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.219698 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.241050 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.257615 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.268037 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.290000 4965 ???:1] "http: TLS handshake error from 192.168.126.11:48694: no serving certificate available for the kubelet" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.432579 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.480170 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.498305 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 12:00:29 crc 
kubenswrapper[4965]: I0318 12:00:29.508281 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.528396 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.611627 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.666083 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.799204 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.801433 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.804006 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.840304 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.869564 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 12:00:29 crc kubenswrapper[4965]: I0318 12:00:29.897540 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.079384 4965 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.118387 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.241605 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.279364 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.326250 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.347331 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.388276 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.476542 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.519879 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.530202 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.532458 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:00:30 crc 
kubenswrapper[4965]: I0318 12:00:30.589735 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.662454 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.702830 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.740270 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.775222 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.848803 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.958930 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 12:00:30 crc kubenswrapper[4965]: I0318 12:00:30.962487 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.062695 4965 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.068788 4965 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: 
connect: connection refused" start-of-body= Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.068921 4965 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.068993 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.069971 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"ee73de6fc8900ed347a66c5c1281716c698175d4d7ba52f4424c60f8f2730af0"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.070220 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://ee73de6fc8900ed347a66c5c1281716c698175d4d7ba52f4424c60f8f2730af0" gracePeriod=30 Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.119015 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.179598 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.189527 4965 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.198416 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.261274 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.286399 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.366815 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.380488 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.430238 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.432067 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.446894 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.528907 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.590118 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.728197 4965 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.808005 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.822821 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.903894 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.921526 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.973964 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.975674 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.995382 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 12:00:31 crc kubenswrapper[4965]: I0318 12:00:31.997695 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 12:00:32 crc kubenswrapper[4965]: I0318 12:00:32.088295 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 12:00:32 crc kubenswrapper[4965]: I0318 12:00:32.103287 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 12:00:32 crc kubenswrapper[4965]: I0318 12:00:32.125129 
4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 12:00:32 crc kubenswrapper[4965]: I0318 12:00:32.142254 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 12:00:32 crc kubenswrapper[4965]: I0318 12:00:32.267562 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 12:00:32 crc kubenswrapper[4965]: I0318 12:00:32.350970 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 12:00:32 crc kubenswrapper[4965]: I0318 12:00:32.589327 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 12:00:32 crc kubenswrapper[4965]: I0318 12:00:32.591276 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 12:00:32 crc kubenswrapper[4965]: I0318 12:00:32.636763 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.054997 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.127688 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.154073 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.209460 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 12:00:33 crc 
kubenswrapper[4965]: I0318 12:00:33.230286 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.273257 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.284866 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.319648 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.320942 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.409435 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.418986 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.434762 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.521909 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.567484 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.626966 4965 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.688368 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.781486 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.782750 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.808793 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.841550 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.872981 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.945530 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 18 12:00:33 crc kubenswrapper[4965]: I0318 12:00:33.980945 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.103730 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.159817 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.200254 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.208206 4965 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.323746 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.357931 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.456651 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.524133 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.603694 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.686605 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.739806 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.739812 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.777745 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.780332 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.821981 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.876827 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 12:00:34 crc kubenswrapper[4965]: I0318 12:00:34.922872 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.088646 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.117412 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.211608 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.306246 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.323545 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.345214 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.363851 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.467461 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.511250 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.548979 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.575985 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.601422 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.601980 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.699594 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.721140 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.721830 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.747725 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.773411 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.842269 4965 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 18 12:00:35 crc kubenswrapper[4965]: I0318 12:00:35.842521 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.023862 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.057965 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.081059 4965 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.102857 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.167062 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.253023 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.267843 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.341878 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.373649 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.474959 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.583164 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.631962 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.679107 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.692981 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.745935 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.765210 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.780077 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.863039 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.890364 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 12:00:36 crc kubenswrapper[4965]: I0318 12:00:36.988462 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.012766 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.015699 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.217608 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.305465 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.351412 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.431064 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.521431 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.668849 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.714172 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.876559 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 12:00:37 crc kubenswrapper[4965]: I0318 12:00:37.966254 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.030320 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.044826 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.060065 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.067032 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.120512 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.154798 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.246032 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.302835 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.344999 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.593527 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.632330 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.646565 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.690351 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.720642 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.771748 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.783972 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.875831 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.897214 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 18 12:00:38 crc kubenswrapper[4965]: I0318 12:00:38.948120 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.035988 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.114470 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.148219 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.160909 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.291242 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.302311 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.485251 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.554725 4965 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.558759 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.558805 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.562576 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.580866 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.580843414 podStartE2EDuration="25.580843414s" podCreationTimestamp="2026-03-18 12:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:00:39.574293146 +0000 UTC m=+244.560480625" watchObservedRunningTime="2026-03-18 12:00:39.580843414 +0000 UTC m=+244.567030903"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.738080 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 18 12:00:39 crc kubenswrapper[4965]: I0318 12:00:39.868313 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 12:00:40 crc kubenswrapper[4965]: I0318 12:00:40.045478 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 18 12:00:40 crc kubenswrapper[4965]: I0318 12:00:40.262701 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 18 12:00:40 crc kubenswrapper[4965]: I0318 12:00:40.317116 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 18 12:00:40 crc kubenswrapper[4965]: I0318 12:00:40.611994 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 12:00:40 crc kubenswrapper[4965]: I0318 12:00:40.623860 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 18 12:00:40 crc kubenswrapper[4965]: I0318 12:00:40.850529 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 18 12:00:41 crc kubenswrapper[4965]: I0318 12:00:41.201227 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 18 12:00:41 crc kubenswrapper[4965]: I0318 12:00:41.247263 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 18 12:00:41 crc kubenswrapper[4965]: I0318 12:00:41.304436 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 12:00:41 crc kubenswrapper[4965]: I0318 12:00:41.443349 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 18 12:00:41 crc kubenswrapper[4965]: I0318 12:00:41.601935 4965 patch_prober.go:28] interesting pod/machine-config-daemon-8l67n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:00:41 crc kubenswrapper[4965]: I0318 12:00:41.602006 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" podUID="e9a53215-1d0d-47de-92c4-cea3209fe4fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:00:41 crc kubenswrapper[4965]: I0318 12:00:41.627015 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 12:00:41 crc kubenswrapper[4965]: I0318 12:00:41.832085 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 12:00:42 crc kubenswrapper[4965]: I0318 12:00:42.683938 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 12:00:42 crc kubenswrapper[4965]: I0318 12:00:42.826382 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.227744 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.608222 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjkft"]
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.608493 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sjkft" podUID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerName="registry-server" containerID="cri-o://96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5" gracePeriod=30
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.622713 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5wjlx"]
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.623467 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5wjlx" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerName="registry-server" containerID="cri-o://9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f" gracePeriod=30
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.632784 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8pdbx"]
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.633083 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" podUID="beb408f9-638c-4bd9-b4ec-c72bc43286e3" containerName="marketplace-operator" containerID="cri-o://6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda" gracePeriod=30
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.637054 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwpmg"]
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.637796 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gwpmg" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerName="registry-server" containerID="cri-o://c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e" gracePeriod=30
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.649278 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zx7g7"]
Mar 18 12:00:43 crc kubenswrapper[4965]: I0318 12:00:43.649889 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zx7g7" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerName="registry-server" containerID="cri-o://f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d" gracePeriod=30
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.141784 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wjlx"
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.162413 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-catalog-content\") pod \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.162470 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-utilities\") pod \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.162603 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-895zv\" (UniqueName: \"kubernetes.io/projected/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-kube-api-access-895zv\") pod \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\" (UID: \"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.165135 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-utilities" (OuterVolumeSpecName: "utilities") pod "39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" (UID: "39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.180684 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-kube-api-access-895zv" (OuterVolumeSpecName: "kube-api-access-895zv") pod "39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" (UID: "39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5"). InnerVolumeSpecName "kube-api-access-895zv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.227827 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" (UID: "39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.232176 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zx7g7"
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.238198 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwpmg"
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.241599 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx"
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.246254 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjkft"
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263522 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-operator-metrics\") pod \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263573 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-utilities\") pod \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263598 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-catalog-content\") pod \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263619 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwnvg\" (UniqueName: \"kubernetes.io/projected/c3614d69-f3b4-4496-af2f-d119c56de1c7-kube-api-access-zwnvg\") pod \"c3614d69-f3b4-4496-af2f-d119c56de1c7\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263638 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxtzh\" (UniqueName: \"kubernetes.io/projected/8edec177-0701-4d1c-bb61-33c0a05df51d-kube-api-access-bxtzh\") pod \"8edec177-0701-4d1c-bb61-33c0a05df51d\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263681 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-catalog-content\") pod \"8edec177-0701-4d1c-bb61-33c0a05df51d\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263703 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-utilities\") pod \"c3614d69-f3b4-4496-af2f-d119c56de1c7\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263727 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-trusted-ca\") pod \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263752 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwm5l\" (UniqueName: \"kubernetes.io/projected/beb408f9-638c-4bd9-b4ec-c72bc43286e3-kube-api-access-dwm5l\") pod \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\" (UID: \"beb408f9-638c-4bd9-b4ec-c72bc43286e3\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263777 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-catalog-content\") pod \"c3614d69-f3b4-4496-af2f-d119c56de1c7\" (UID: \"c3614d69-f3b4-4496-af2f-d119c56de1c7\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263795 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pww9d\" (UniqueName: \"kubernetes.io/projected/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-kube-api-access-pww9d\") pod \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\" (UID: \"f76b91e1-f768-4cb1-857d-6e3eb31e59f6\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263812 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-utilities\") pod \"8edec177-0701-4d1c-bb61-33c0a05df51d\" (UID: \"8edec177-0701-4d1c-bb61-33c0a05df51d\") "
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.263957 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-895zv\" (UniqueName: \"kubernetes.io/projected/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-kube-api-access-895zv\") on node \"crc\" DevicePath \"\""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.264002 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.264014 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.264848 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-utilities" (OuterVolumeSpecName: "utilities") pod "8edec177-0701-4d1c-bb61-33c0a05df51d" (UID: "8edec177-0701-4d1c-bb61-33c0a05df51d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.264896 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-utilities" (OuterVolumeSpecName: "utilities") pod "f76b91e1-f768-4cb1-857d-6e3eb31e59f6" (UID: "f76b91e1-f768-4cb1-857d-6e3eb31e59f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.265258 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-utilities" (OuterVolumeSpecName: "utilities") pod "c3614d69-f3b4-4496-af2f-d119c56de1c7" (UID: "c3614d69-f3b4-4496-af2f-d119c56de1c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.265375 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "beb408f9-638c-4bd9-b4ec-c72bc43286e3" (UID: "beb408f9-638c-4bd9-b4ec-c72bc43286e3"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.268974 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edec177-0701-4d1c-bb61-33c0a05df51d-kube-api-access-bxtzh" (OuterVolumeSpecName: "kube-api-access-bxtzh") pod "8edec177-0701-4d1c-bb61-33c0a05df51d" (UID: "8edec177-0701-4d1c-bb61-33c0a05df51d"). InnerVolumeSpecName "kube-api-access-bxtzh".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.269047 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3614d69-f3b4-4496-af2f-d119c56de1c7-kube-api-access-zwnvg" (OuterVolumeSpecName: "kube-api-access-zwnvg") pod "c3614d69-f3b4-4496-af2f-d119c56de1c7" (UID: "c3614d69-f3b4-4496-af2f-d119c56de1c7"). InnerVolumeSpecName "kube-api-access-zwnvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.269069 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-kube-api-access-pww9d" (OuterVolumeSpecName: "kube-api-access-pww9d") pod "f76b91e1-f768-4cb1-857d-6e3eb31e59f6" (UID: "f76b91e1-f768-4cb1-857d-6e3eb31e59f6"). InnerVolumeSpecName "kube-api-access-pww9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.269512 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb408f9-638c-4bd9-b4ec-c72bc43286e3-kube-api-access-dwm5l" (OuterVolumeSpecName: "kube-api-access-dwm5l") pod "beb408f9-638c-4bd9-b4ec-c72bc43286e3" (UID: "beb408f9-638c-4bd9-b4ec-c72bc43286e3"). InnerVolumeSpecName "kube-api-access-dwm5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.269957 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "beb408f9-638c-4bd9-b4ec-c72bc43286e3" (UID: "beb408f9-638c-4bd9-b4ec-c72bc43286e3"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.302353 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f76b91e1-f768-4cb1-857d-6e3eb31e59f6" (UID: "f76b91e1-f768-4cb1-857d-6e3eb31e59f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.330266 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3614d69-f3b4-4496-af2f-d119c56de1c7" (UID: "c3614d69-f3b4-4496-af2f-d119c56de1c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364720 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pww9d\" (UniqueName: \"kubernetes.io/projected/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-kube-api-access-pww9d\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364767 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364788 4965 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364806 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364822 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76b91e1-f768-4cb1-857d-6e3eb31e59f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364839 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwnvg\" (UniqueName: \"kubernetes.io/projected/c3614d69-f3b4-4496-af2f-d119c56de1c7-kube-api-access-zwnvg\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364856 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxtzh\" (UniqueName: \"kubernetes.io/projected/8edec177-0701-4d1c-bb61-33c0a05df51d-kube-api-access-bxtzh\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364871 4965 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364886 4965 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb408f9-638c-4bd9-b4ec-c72bc43286e3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364901 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwm5l\" (UniqueName: \"kubernetes.io/projected/beb408f9-638c-4bd9-b4ec-c72bc43286e3-kube-api-access-dwm5l\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.364916 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c3614d69-f3b4-4496-af2f-d119c56de1c7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.413105 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8edec177-0701-4d1c-bb61-33c0a05df51d" (UID: "8edec177-0701-4d1c-bb61-33c0a05df51d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.465802 4965 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edec177-0701-4d1c-bb61-33c0a05df51d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.695409 4965 generic.go:334] "Generic (PLEG): container finished" podID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerID="9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f" exitCode=0 Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.695487 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wjlx" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.695491 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wjlx" event={"ID":"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5","Type":"ContainerDied","Data":"9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.695627 4965 scope.go:117] "RemoveContainer" containerID="9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.695848 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wjlx" event={"ID":"39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5","Type":"ContainerDied","Data":"8a0202d4beb33ca70efb1d561f22b4fdb3c95166da0d747508231f1ae42518fa"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.699530 4965 generic.go:334] "Generic (PLEG): container finished" podID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerID="96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5" exitCode=0 Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.699590 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjkft" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.699593 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjkft" event={"ID":"c3614d69-f3b4-4496-af2f-d119c56de1c7","Type":"ContainerDied","Data":"96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.699697 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjkft" event={"ID":"c3614d69-f3b4-4496-af2f-d119c56de1c7","Type":"ContainerDied","Data":"e75d438cda8551ccf31309f21ca2803ea5494c32f2b41b0f1792d9ff2805379b"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.701096 4965 generic.go:334] "Generic (PLEG): container finished" podID="beb408f9-638c-4bd9-b4ec-c72bc43286e3" containerID="6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda" exitCode=0 Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.701173 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.701205 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" event={"ID":"beb408f9-638c-4bd9-b4ec-c72bc43286e3","Type":"ContainerDied","Data":"6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.701236 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8pdbx" event={"ID":"beb408f9-638c-4bd9-b4ec-c72bc43286e3","Type":"ContainerDied","Data":"e59d8ce29c399f6c8778404556cd9f40a79f29b6aac26619b624302a8b88dc72"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.709355 4965 generic.go:334] "Generic (PLEG): container finished" podID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerID="f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d" exitCode=0 Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.709420 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zx7g7" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.709430 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7g7" event={"ID":"8edec177-0701-4d1c-bb61-33c0a05df51d","Type":"ContainerDied","Data":"f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.709925 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7g7" event={"ID":"8edec177-0701-4d1c-bb61-33c0a05df51d","Type":"ContainerDied","Data":"d57b2fc4a17c699444ebe3febc74bacda7beea71728002c70a90d7be5c26c16b"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.714676 4965 generic.go:334] "Generic (PLEG): container finished" podID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerID="c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e" exitCode=0 Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.714704 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwpmg" event={"ID":"f76b91e1-f768-4cb1-857d-6e3eb31e59f6","Type":"ContainerDied","Data":"c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.714726 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwpmg" event={"ID":"f76b91e1-f768-4cb1-857d-6e3eb31e59f6","Type":"ContainerDied","Data":"f5240c39614a6d1944304139b359ff5c416d3939337b76bbf246b2e767b8d847"} Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.714843 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwpmg" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.719192 4965 scope.go:117] "RemoveContainer" containerID="5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.736974 4965 scope.go:117] "RemoveContainer" containerID="867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.739835 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5wjlx"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.747708 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5wjlx"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.757414 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjkft"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.760609 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sjkft"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.778731 4965 scope.go:117] "RemoveContainer" containerID="9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.779094 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f\": container with ID starting with 9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f not found: ID does not exist" containerID="9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.779129 4965 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f"} err="failed to get container status \"9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f\": rpc error: code = NotFound desc = could not find container \"9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f\": container with ID starting with 9c047746c7fee720b87490d138715f8689974916b5f46538d70fb83512b0bf6f not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.779151 4965 scope.go:117] "RemoveContainer" containerID="5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.779398 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d\": container with ID starting with 5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d not found: ID does not exist" containerID="5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.779459 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d"} err="failed to get container status \"5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d\": rpc error: code = NotFound desc = could not find container \"5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d\": container with ID starting with 5d8809f5914ab3ab48d3714cbcaa316e1897b7b7460703533a0bfb1a835f245d not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.779476 4965 scope.go:117] "RemoveContainer" containerID="867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.780699 4965 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64\": container with ID starting with 867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64 not found: ID does not exist" containerID="867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.780726 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64"} err="failed to get container status \"867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64\": rpc error: code = NotFound desc = could not find container \"867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64\": container with ID starting with 867db37fa20fe9d31eb0ed1ff6142f3377c9dbca39c9b2cbb04c84eaf5354a64 not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.780743 4965 scope.go:117] "RemoveContainer" containerID="96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.784499 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8pdbx"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.792444 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8pdbx"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.796348 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zx7g7"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.803916 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zx7g7"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.807999 4965 scope.go:117] "RemoveContainer" 
containerID="c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.808740 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwpmg"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.814155 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwpmg"] Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.828998 4965 scope.go:117] "RemoveContainer" containerID="e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.845158 4965 scope.go:117] "RemoveContainer" containerID="96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.845588 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5\": container with ID starting with 96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5 not found: ID does not exist" containerID="96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.845629 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5"} err="failed to get container status \"96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5\": rpc error: code = NotFound desc = could not find container \"96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5\": container with ID starting with 96cb4ed8007f5521ad25fbc1339a01105adfe6c8ebd54aacbbf6a7af040d6ae5 not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.845693 4965 scope.go:117] "RemoveContainer" 
containerID="c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.846078 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860\": container with ID starting with c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860 not found: ID does not exist" containerID="c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.846109 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860"} err="failed to get container status \"c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860\": rpc error: code = NotFound desc = could not find container \"c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860\": container with ID starting with c0e7dc80203cb3df3f3ced8cca96f483e671b379bb247bd04500d14affa27860 not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.846126 4965 scope.go:117] "RemoveContainer" containerID="e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.846497 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025\": container with ID starting with e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025 not found: ID does not exist" containerID="e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.846524 4965 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025"} err="failed to get container status \"e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025\": rpc error: code = NotFound desc = could not find container \"e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025\": container with ID starting with e9b8adedeb83db4fa95df036cfd796ad623de694f22bf67c6298b30e9f6e2025 not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.846541 4965 scope.go:117] "RemoveContainer" containerID="6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.858301 4965 scope.go:117] "RemoveContainer" containerID="6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.858589 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda\": container with ID starting with 6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda not found: ID does not exist" containerID="6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.858623 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda"} err="failed to get container status \"6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda\": rpc error: code = NotFound desc = could not find container \"6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda\": container with ID starting with 6141c19bfb5535d30bd817ac8fff790599c5ef472e9635ef6ff384b0a8294bda not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.858652 4965 scope.go:117] "RemoveContainer" 
containerID="f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.869134 4965 scope.go:117] "RemoveContainer" containerID="9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.886309 4965 scope.go:117] "RemoveContainer" containerID="e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.899485 4965 scope.go:117] "RemoveContainer" containerID="f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.899949 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d\": container with ID starting with f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d not found: ID does not exist" containerID="f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.899975 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d"} err="failed to get container status \"f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d\": rpc error: code = NotFound desc = could not find container \"f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d\": container with ID starting with f74cec4d4cd2fc07c553b2ba8be7311beb1dd80ea98cfdb269f4ac0cae6f1f6d not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.899997 4965 scope.go:117] "RemoveContainer" containerID="9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.900279 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d\": container with ID starting with 9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d not found: ID does not exist" containerID="9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.900299 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d"} err="failed to get container status \"9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d\": rpc error: code = NotFound desc = could not find container \"9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d\": container with ID starting with 9fca9c99465cc9f8ca5804e92a204bd74f72ef3c66a34d005141ee2a779f084d not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.900313 4965 scope.go:117] "RemoveContainer" containerID="e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.900540 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b\": container with ID starting with e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b not found: ID does not exist" containerID="e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.900557 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b"} err="failed to get container status \"e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b\": rpc error: code = NotFound desc = could not find container 
\"e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b\": container with ID starting with e1931fa95c6a701f734dd58e67655754465994159603a5bb3d726ae9e7141e4b not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.900568 4965 scope.go:117] "RemoveContainer" containerID="c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.916067 4965 scope.go:117] "RemoveContainer" containerID="954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.929231 4965 scope.go:117] "RemoveContainer" containerID="0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.942997 4965 scope.go:117] "RemoveContainer" containerID="c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.943647 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e\": container with ID starting with c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e not found: ID does not exist" containerID="c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.943750 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e"} err="failed to get container status \"c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e\": rpc error: code = NotFound desc = could not find container \"c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e\": container with ID starting with c4325f73e755e8ce62ff9b4bd7e7d386a96dcca05fca2e9e97fbaef59eb9446e not found: ID does not exist" Mar 18 12:00:44 crc 
kubenswrapper[4965]: I0318 12:00:44.943790 4965 scope.go:117] "RemoveContainer" containerID="954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.944321 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876\": container with ID starting with 954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876 not found: ID does not exist" containerID="954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.944370 4965 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876"} err="failed to get container status \"954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876\": rpc error: code = NotFound desc = could not find container \"954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876\": container with ID starting with 954f130938e5373601c9fd30dac626fc3554b044dec5760e4a00b31a5ef6d876 not found: ID does not exist" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.944399 4965 scope.go:117] "RemoveContainer" containerID="0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb" Mar 18 12:00:44 crc kubenswrapper[4965]: E0318 12:00:44.944921 4965 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb\": container with ID starting with 0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb not found: ID does not exist" containerID="0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb" Mar 18 12:00:44 crc kubenswrapper[4965]: I0318 12:00:44.944967 4965 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb"} err="failed to get container status \"0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb\": rpc error: code = NotFound desc = could not find container \"0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb\": container with ID starting with 0a996c7c173b97f114336185550cfccc9a982b74e228d1eed57002d0ba0496eb not found: ID does not exist" Mar 18 12:00:46 crc kubenswrapper[4965]: I0318 12:00:46.028813 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" path="/var/lib/kubelet/pods/39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5/volumes" Mar 18 12:00:46 crc kubenswrapper[4965]: I0318 12:00:46.029557 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" path="/var/lib/kubelet/pods/8edec177-0701-4d1c-bb61-33c0a05df51d/volumes" Mar 18 12:00:46 crc kubenswrapper[4965]: I0318 12:00:46.030248 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb408f9-638c-4bd9-b4ec-c72bc43286e3" path="/var/lib/kubelet/pods/beb408f9-638c-4bd9-b4ec-c72bc43286e3/volumes" Mar 18 12:00:46 crc kubenswrapper[4965]: I0318 12:00:46.031274 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3614d69-f3b4-4496-af2f-d119c56de1c7" path="/var/lib/kubelet/pods/c3614d69-f3b4-4496-af2f-d119c56de1c7/volumes" Mar 18 12:00:46 crc kubenswrapper[4965]: I0318 12:00:46.031991 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" path="/var/lib/kubelet/pods/f76b91e1-f768-4cb1-857d-6e3eb31e59f6/volumes" Mar 18 12:00:46 crc kubenswrapper[4965]: I0318 12:00:46.078769 4965 csr.go:261] certificate signing request csr-mrjw6 is approved, waiting to be issued Mar 18 12:00:48 crc kubenswrapper[4965]: I0318 12:00:48.362966 4965 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:00:48 crc kubenswrapper[4965]: I0318 12:00:48.363460 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://232d60ca378e6098c064c7121784e992af17a294a74508322c3a4d5dcae4f6f7" gracePeriod=5 Mar 18 12:00:53 crc kubenswrapper[4965]: I0318 12:00:53.777214 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 12:00:53 crc kubenswrapper[4965]: I0318 12:00:53.777566 4965 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="232d60ca378e6098c064c7121784e992af17a294a74508322c3a4d5dcae4f6f7" exitCode=137 Mar 18 12:00:53 crc kubenswrapper[4965]: I0318 12:00:53.936694 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 12:00:53 crc kubenswrapper[4965]: I0318 12:00:53.936791 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.093573 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.095096 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.095154 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.095201 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.095228 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.094114 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.096246 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.097103 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.097113 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.110608 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.196492 4965 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.196534 4965 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.196550 4965 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.196559 4965 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.196570 4965 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.784817 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.784898 4965 scope.go:117] "RemoveContainer" containerID="232d60ca378e6098c064c7121784e992af17a294a74508322c3a4d5dcae4f6f7" Mar 18 12:00:54 crc kubenswrapper[4965]: I0318 12:00:54.784962 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:00:56 crc kubenswrapper[4965]: I0318 12:00:56.030038 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 12:01:01 crc kubenswrapper[4965]: I0318 12:01:01.819798 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 18 12:01:01 crc kubenswrapper[4965]: I0318 12:01:01.821489 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:01:01 crc kubenswrapper[4965]: I0318 12:01:01.822016 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 12:01:01 crc kubenswrapper[4965]: I0318 12:01:01.822058 4965 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ee73de6fc8900ed347a66c5c1281716c698175d4d7ba52f4424c60f8f2730af0" exitCode=137 Mar 18 12:01:01 crc kubenswrapper[4965]: I0318 12:01:01.822090 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ee73de6fc8900ed347a66c5c1281716c698175d4d7ba52f4424c60f8f2730af0"} Mar 18 12:01:01 crc kubenswrapper[4965]: I0318 12:01:01.822118 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4765c2b4a36556380389eac98341b0ac68bafd1963e2f42bc072b2a339c8430"} Mar 18 12:01:01 crc kubenswrapper[4965]: I0318 12:01:01.822135 4965 scope.go:117] "RemoveContainer" containerID="9dd304c138d70e2bab50741090aace5284f04e2d25fe04cf9188a91b6976f720" Mar 18 12:01:02 crc kubenswrapper[4965]: I0318 12:01:02.827480 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 18 12:01:02 crc kubenswrapper[4965]: I0318 12:01:02.828681 4965 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:01:05 crc kubenswrapper[4965]: I0318 12:01:05.191648 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.068720 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.073179 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.602326 4965 patch_prober.go:28] interesting pod/machine-config-daemon-8l67n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.602639 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" 
podUID="e9a53215-1d0d-47de-92c4-cea3209fe4fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.602788 4965 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.603484 4965 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee65076d3fbf5f0628fdea60bce51ce496c09b690ec6c4b4248352a70b8fb132"} pod="openshift-machine-config-operator/machine-config-daemon-8l67n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.603620 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" podUID="e9a53215-1d0d-47de-92c4-cea3209fe4fa" containerName="machine-config-daemon" containerID="cri-o://ee65076d3fbf5f0628fdea60bce51ce496c09b690ec6c4b4248352a70b8fb132" gracePeriod=600 Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.874056 4965 generic.go:334] "Generic (PLEG): container finished" podID="e9a53215-1d0d-47de-92c4-cea3209fe4fa" containerID="ee65076d3fbf5f0628fdea60bce51ce496c09b690ec6c4b4248352a70b8fb132" exitCode=0 Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.875910 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" event={"ID":"e9a53215-1d0d-47de-92c4-cea3209fe4fa","Type":"ContainerDied","Data":"ee65076d3fbf5f0628fdea60bce51ce496c09b690ec6c4b4248352a70b8fb132"} Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.875973 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-8l67n" event={"ID":"e9a53215-1d0d-47de-92c4-cea3209fe4fa","Type":"ContainerStarted","Data":"2f746ca8b1af1c4a5e8e0ca3b2e3b8e31e737add5de216561fcee0defff4da5c"} Mar 18 12:01:11 crc kubenswrapper[4965]: I0318 12:01:11.880066 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.711398 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rqmvm"] Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712080 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerName="extract-utilities" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712092 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerName="extract-utilities" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712100 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" containerName="installer" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712106 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" containerName="installer" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712118 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerName="extract-content" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712124 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerName="extract-content" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712135 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerName="extract-utilities" Mar 18 
12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712141 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerName="extract-utilities" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712149 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerName="extract-utilities" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712154 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerName="extract-utilities" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712163 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerName="registry-server" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712169 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerName="registry-server" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712176 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712181 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712189 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerName="registry-server" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712194 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerName="registry-server" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712203 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerName="registry-server" Mar 18 12:01:23 crc 
kubenswrapper[4965]: I0318 12:01:23.712209 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerName="registry-server" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712219 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerName="extract-utilities" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712224 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerName="extract-utilities" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712231 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerName="extract-content" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712236 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerName="extract-content" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712244 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb408f9-638c-4bd9-b4ec-c72bc43286e3" containerName="marketplace-operator" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712249 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb408f9-638c-4bd9-b4ec-c72bc43286e3" containerName="marketplace-operator" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712258 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerName="extract-content" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712263 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerName="extract-content" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712270 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerName="registry-server" Mar 18 12:01:23 crc 
kubenswrapper[4965]: I0318 12:01:23.712276 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerName="registry-server" Mar 18 12:01:23 crc kubenswrapper[4965]: E0318 12:01:23.712284 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerName="extract-content" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712289 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerName="extract-content" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712376 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a1ff3a-2ec1-4c9b-9569-b3a0a24de4f5" containerName="registry-server" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712386 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="8edec177-0701-4d1c-bb61-33c0a05df51d" containerName="registry-server" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712398 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76b91e1-f768-4cb1-857d-6e3eb31e59f6" containerName="registry-server" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712404 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f56f075-8aa8-4030-849a-ae1fc80135ef" containerName="installer" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712412 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="beb408f9-638c-4bd9-b4ec-c72bc43286e3" containerName="marketplace-operator" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712419 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.712426 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3614d69-f3b4-4496-af2f-d119c56de1c7" containerName="registry-server" Mar 18 
12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.713122 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.716394 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.716724 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.717943 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.730940 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqmvm"] Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.862359 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rddf9\" (UniqueName: \"kubernetes.io/projected/4c6df52f-4c11-418c-9027-30248690ba00-kube-api-access-rddf9\") pod \"certified-operators-rqmvm\" (UID: \"4c6df52f-4c11-418c-9027-30248690ba00\") " pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.862417 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6df52f-4c11-418c-9027-30248690ba00-catalog-content\") pod \"certified-operators-rqmvm\" (UID: \"4c6df52f-4c11-418c-9027-30248690ba00\") " pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.862459 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4c6df52f-4c11-418c-9027-30248690ba00-utilities\") pod \"certified-operators-rqmvm\" (UID: \"4c6df52f-4c11-418c-9027-30248690ba00\") " pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.904847 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v6sqx"] Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.905832 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.907932 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.912088 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6sqx"] Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.963430 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6df52f-4c11-418c-9027-30248690ba00-utilities\") pod \"certified-operators-rqmvm\" (UID: \"4c6df52f-4c11-418c-9027-30248690ba00\") " pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.963504 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rddf9\" (UniqueName: \"kubernetes.io/projected/4c6df52f-4c11-418c-9027-30248690ba00-kube-api-access-rddf9\") pod \"certified-operators-rqmvm\" (UID: \"4c6df52f-4c11-418c-9027-30248690ba00\") " pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.963534 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6df52f-4c11-418c-9027-30248690ba00-catalog-content\") pod 
\"certified-operators-rqmvm\" (UID: \"4c6df52f-4c11-418c-9027-30248690ba00\") " pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.964026 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6df52f-4c11-418c-9027-30248690ba00-catalog-content\") pod \"certified-operators-rqmvm\" (UID: \"4c6df52f-4c11-418c-9027-30248690ba00\") " pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.964067 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6df52f-4c11-418c-9027-30248690ba00-utilities\") pod \"certified-operators-rqmvm\" (UID: \"4c6df52f-4c11-418c-9027-30248690ba00\") " pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:23 crc kubenswrapper[4965]: I0318 12:01:23.983007 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rddf9\" (UniqueName: \"kubernetes.io/projected/4c6df52f-4c11-418c-9027-30248690ba00-kube-api-access-rddf9\") pod \"certified-operators-rqmvm\" (UID: \"4c6df52f-4c11-418c-9027-30248690ba00\") " pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.031156 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.065082 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7b87de4-ba72-4fb3-a94e-716b41d7881f-catalog-content\") pod \"community-operators-v6sqx\" (UID: \"e7b87de4-ba72-4fb3-a94e-716b41d7881f\") " pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.065416 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjfcf\" (UniqueName: \"kubernetes.io/projected/e7b87de4-ba72-4fb3-a94e-716b41d7881f-kube-api-access-pjfcf\") pod \"community-operators-v6sqx\" (UID: \"e7b87de4-ba72-4fb3-a94e-716b41d7881f\") " pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.065577 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7b87de4-ba72-4fb3-a94e-716b41d7881f-utilities\") pod \"community-operators-v6sqx\" (UID: \"e7b87de4-ba72-4fb3-a94e-716b41d7881f\") " pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.166467 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7b87de4-ba72-4fb3-a94e-716b41d7881f-catalog-content\") pod \"community-operators-v6sqx\" (UID: \"e7b87de4-ba72-4fb3-a94e-716b41d7881f\") " pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.166889 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjfcf\" (UniqueName: \"kubernetes.io/projected/e7b87de4-ba72-4fb3-a94e-716b41d7881f-kube-api-access-pjfcf\") pod 
\"community-operators-v6sqx\" (UID: \"e7b87de4-ba72-4fb3-a94e-716b41d7881f\") " pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.166923 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7b87de4-ba72-4fb3-a94e-716b41d7881f-utilities\") pod \"community-operators-v6sqx\" (UID: \"e7b87de4-ba72-4fb3-a94e-716b41d7881f\") " pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.167424 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7b87de4-ba72-4fb3-a94e-716b41d7881f-utilities\") pod \"community-operators-v6sqx\" (UID: \"e7b87de4-ba72-4fb3-a94e-716b41d7881f\") " pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.167433 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7b87de4-ba72-4fb3-a94e-716b41d7881f-catalog-content\") pod \"community-operators-v6sqx\" (UID: \"e7b87de4-ba72-4fb3-a94e-716b41d7881f\") " pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.200573 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjfcf\" (UniqueName: \"kubernetes.io/projected/e7b87de4-ba72-4fb3-a94e-716b41d7881f-kube-api-access-pjfcf\") pod \"community-operators-v6sqx\" (UID: \"e7b87de4-ba72-4fb3-a94e-716b41d7881f\") " pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.262616 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.441383 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqmvm"] Mar 18 12:01:24 crc kubenswrapper[4965]: W0318 12:01:24.447282 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c6df52f_4c11_418c_9027_30248690ba00.slice/crio-563cb303cc7b89239b6c68a866e63a7c635aef3c17b2b958e8e1bc0b20c18f1f WatchSource:0}: Error finding container 563cb303cc7b89239b6c68a866e63a7c635aef3c17b2b958e8e1bc0b20c18f1f: Status 404 returned error can't find the container with id 563cb303cc7b89239b6c68a866e63a7c635aef3c17b2b958e8e1bc0b20c18f1f Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.672951 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6sqx"] Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.971824 4965 generic.go:334] "Generic (PLEG): container finished" podID="4c6df52f-4c11-418c-9027-30248690ba00" containerID="c507eb3acd8c010e1fb68b5b661e7ec51c0e8c1d3ee8e0ad122c954dfa517ec9" exitCode=0 Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.971888 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqmvm" event={"ID":"4c6df52f-4c11-418c-9027-30248690ba00","Type":"ContainerDied","Data":"c507eb3acd8c010e1fb68b5b661e7ec51c0e8c1d3ee8e0ad122c954dfa517ec9"} Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.971939 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqmvm" event={"ID":"4c6df52f-4c11-418c-9027-30248690ba00","Type":"ContainerStarted","Data":"563cb303cc7b89239b6c68a866e63a7c635aef3c17b2b958e8e1bc0b20c18f1f"} Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.973153 4965 generic.go:334] "Generic (PLEG): container finished" 
podID="e7b87de4-ba72-4fb3-a94e-716b41d7881f" containerID="92fcd9a835dd1d9709e8d124104cf67b7aed1ad82baad1d900ddae5a0bde8ee2" exitCode=0 Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.973176 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6sqx" event={"ID":"e7b87de4-ba72-4fb3-a94e-716b41d7881f","Type":"ContainerDied","Data":"92fcd9a835dd1d9709e8d124104cf67b7aed1ad82baad1d900ddae5a0bde8ee2"} Mar 18 12:01:24 crc kubenswrapper[4965]: I0318 12:01:24.973198 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6sqx" event={"ID":"e7b87de4-ba72-4fb3-a94e-716b41d7881f","Type":"ContainerStarted","Data":"e1c1c5762597087d99959cf741af7683ba2acbb2dc4b7ecf5fa3ae346b837217"} Mar 18 12:01:25 crc kubenswrapper[4965]: I0318 12:01:25.982548 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6sqx" event={"ID":"e7b87de4-ba72-4fb3-a94e-716b41d7881f","Type":"ContainerStarted","Data":"c8c147c41cf01ca758e4ca3e17dd0413589c653b2e9c79bdcfd1247fc432e845"} Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.108851 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nll29"] Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.110081 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.112137 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.120689 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nll29"] Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.291135 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b28d189-a695-4fa9-a184-1a5515d68729-catalog-content\") pod \"redhat-marketplace-nll29\" (UID: \"6b28d189-a695-4fa9-a184-1a5515d68729\") " pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.291240 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b28d189-a695-4fa9-a184-1a5515d68729-utilities\") pod \"redhat-marketplace-nll29\" (UID: \"6b28d189-a695-4fa9-a184-1a5515d68729\") " pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.291277 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjxhc\" (UniqueName: \"kubernetes.io/projected/6b28d189-a695-4fa9-a184-1a5515d68729-kube-api-access-rjxhc\") pod \"redhat-marketplace-nll29\" (UID: \"6b28d189-a695-4fa9-a184-1a5515d68729\") " pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.306968 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hkp58"] Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.307986 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.310892 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.316916 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hkp58"] Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.392313 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24d57\" (UniqueName: \"kubernetes.io/projected/46ebfdaa-4889-4dda-849c-58e0c672d243-kube-api-access-24d57\") pod \"redhat-operators-hkp58\" (UID: \"46ebfdaa-4889-4dda-849c-58e0c672d243\") " pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.392369 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b28d189-a695-4fa9-a184-1a5515d68729-utilities\") pod \"redhat-marketplace-nll29\" (UID: \"6b28d189-a695-4fa9-a184-1a5515d68729\") " pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.392407 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjxhc\" (UniqueName: \"kubernetes.io/projected/6b28d189-a695-4fa9-a184-1a5515d68729-kube-api-access-rjxhc\") pod \"redhat-marketplace-nll29\" (UID: \"6b28d189-a695-4fa9-a184-1a5515d68729\") " pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.392447 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46ebfdaa-4889-4dda-849c-58e0c672d243-utilities\") pod \"redhat-operators-hkp58\" (UID: \"46ebfdaa-4889-4dda-849c-58e0c672d243\") " 
pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.392484 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b28d189-a695-4fa9-a184-1a5515d68729-catalog-content\") pod \"redhat-marketplace-nll29\" (UID: \"6b28d189-a695-4fa9-a184-1a5515d68729\") " pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.392525 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46ebfdaa-4889-4dda-849c-58e0c672d243-catalog-content\") pod \"redhat-operators-hkp58\" (UID: \"46ebfdaa-4889-4dda-849c-58e0c672d243\") " pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.393471 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b28d189-a695-4fa9-a184-1a5515d68729-utilities\") pod \"redhat-marketplace-nll29\" (UID: \"6b28d189-a695-4fa9-a184-1a5515d68729\") " pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.393481 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b28d189-a695-4fa9-a184-1a5515d68729-catalog-content\") pod \"redhat-marketplace-nll29\" (UID: \"6b28d189-a695-4fa9-a184-1a5515d68729\") " pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.415007 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjxhc\" (UniqueName: \"kubernetes.io/projected/6b28d189-a695-4fa9-a184-1a5515d68729-kube-api-access-rjxhc\") pod \"redhat-marketplace-nll29\" (UID: \"6b28d189-a695-4fa9-a184-1a5515d68729\") " 
pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.435447 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.493370 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46ebfdaa-4889-4dda-849c-58e0c672d243-catalog-content\") pod \"redhat-operators-hkp58\" (UID: \"46ebfdaa-4889-4dda-849c-58e0c672d243\") " pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.493426 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24d57\" (UniqueName: \"kubernetes.io/projected/46ebfdaa-4889-4dda-849c-58e0c672d243-kube-api-access-24d57\") pod \"redhat-operators-hkp58\" (UID: \"46ebfdaa-4889-4dda-849c-58e0c672d243\") " pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.493465 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46ebfdaa-4889-4dda-849c-58e0c672d243-utilities\") pod \"redhat-operators-hkp58\" (UID: \"46ebfdaa-4889-4dda-849c-58e0c672d243\") " pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.493954 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46ebfdaa-4889-4dda-849c-58e0c672d243-utilities\") pod \"redhat-operators-hkp58\" (UID: \"46ebfdaa-4889-4dda-849c-58e0c672d243\") " pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.494085 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/46ebfdaa-4889-4dda-849c-58e0c672d243-catalog-content\") pod \"redhat-operators-hkp58\" (UID: \"46ebfdaa-4889-4dda-849c-58e0c672d243\") " pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.522611 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24d57\" (UniqueName: \"kubernetes.io/projected/46ebfdaa-4889-4dda-849c-58e0c672d243-kube-api-access-24d57\") pod \"redhat-operators-hkp58\" (UID: \"46ebfdaa-4889-4dda-849c-58e0c672d243\") " pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.664161 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:26 crc kubenswrapper[4965]: I0318 12:01:26.670196 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nll29"] Mar 18 12:01:26 crc kubenswrapper[4965]: W0318 12:01:26.728176 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b28d189_a695_4fa9_a184_1a5515d68729.slice/crio-28763ef7707252523683e3ff9f2816dcf404134d3b2c986d58686ef2cbedb141 WatchSource:0}: Error finding container 28763ef7707252523683e3ff9f2816dcf404134d3b2c986d58686ef2cbedb141: Status 404 returned error can't find the container with id 28763ef7707252523683e3ff9f2816dcf404134d3b2c986d58686ef2cbedb141 Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.002830 4965 generic.go:334] "Generic (PLEG): container finished" podID="e7b87de4-ba72-4fb3-a94e-716b41d7881f" containerID="c8c147c41cf01ca758e4ca3e17dd0413589c653b2e9c79bdcfd1247fc432e845" exitCode=0 Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.003993 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6sqx" 
event={"ID":"e7b87de4-ba72-4fb3-a94e-716b41d7881f","Type":"ContainerDied","Data":"c8c147c41cf01ca758e4ca3e17dd0413589c653b2e9c79bdcfd1247fc432e845"} Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.006087 4965 generic.go:334] "Generic (PLEG): container finished" podID="4c6df52f-4c11-418c-9027-30248690ba00" containerID="5b3e0fc73a49294761ba7c02a544ccd9e801446b6a32d06aa2276e276a0cb068" exitCode=0 Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.006201 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqmvm" event={"ID":"4c6df52f-4c11-418c-9027-30248690ba00","Type":"ContainerDied","Data":"5b3e0fc73a49294761ba7c02a544ccd9e801446b6a32d06aa2276e276a0cb068"} Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.009277 4965 generic.go:334] "Generic (PLEG): container finished" podID="6b28d189-a695-4fa9-a184-1a5515d68729" containerID="7ccbd2b9f5b8f5348f5f56f2cdc91459b326322edb0c26d4993bf4de7319280b" exitCode=0 Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.009325 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nll29" event={"ID":"6b28d189-a695-4fa9-a184-1a5515d68729","Type":"ContainerDied","Data":"7ccbd2b9f5b8f5348f5f56f2cdc91459b326322edb0c26d4993bf4de7319280b"} Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.009348 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nll29" event={"ID":"6b28d189-a695-4fa9-a184-1a5515d68729","Type":"ContainerStarted","Data":"28763ef7707252523683e3ff9f2816dcf404134d3b2c986d58686ef2cbedb141"} Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.066594 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hkp58"] Mar 18 12:01:27 crc kubenswrapper[4965]: W0318 12:01:27.067747 4965 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ebfdaa_4889_4dda_849c_58e0c672d243.slice/crio-de6db1a21fa91ed854438e9ad94986f70117db1ab04476684177b5a6a513b2a3 WatchSource:0}: Error finding container de6db1a21fa91ed854438e9ad94986f70117db1ab04476684177b5a6a513b2a3: Status 404 returned error can't find the container with id de6db1a21fa91ed854438e9ad94986f70117db1ab04476684177b5a6a513b2a3 Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.351678 4965 csr.go:257] certificate signing request csr-mrjw6 is issued Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.447856 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv"] Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.451145 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.460074 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.460371 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.484750 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv"] Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.488505 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zwf4t"] Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.489370 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.494334 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.494519 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.505704 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.517840 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zwf4t"] Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.543715 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563920-8l9zs"] Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.544527 4965 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563920-8l9zs" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.551141 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.551380 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.563177 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zz9vs" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.568667 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563920-8l9zs"] Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.610846 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzrvm\" (UniqueName: \"kubernetes.io/projected/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-kube-api-access-zzrvm\") pod \"collect-profiles-29563920-x7krv\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.610890 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a561e1d-b979-42db-b58e-f983b79b7217-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zwf4t\" (UID: \"4a561e1d-b979-42db-b58e-f983b79b7217\") " pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.610928 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fgz7\" (UniqueName: \"kubernetes.io/projected/4a561e1d-b979-42db-b58e-f983b79b7217-kube-api-access-7fgz7\") pod 
\"marketplace-operator-79b997595-zwf4t\" (UID: \"4a561e1d-b979-42db-b58e-f983b79b7217\") " pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.610961 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a561e1d-b979-42db-b58e-f983b79b7217-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zwf4t\" (UID: \"4a561e1d-b979-42db-b58e-f983b79b7217\") " pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.610994 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-config-volume\") pod \"collect-profiles-29563920-x7krv\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.611031 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-secret-volume\") pod \"collect-profiles-29563920-x7krv\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.611071 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzmrh\" (UniqueName: \"kubernetes.io/projected/b905f912-a71c-4b94-b3d0-5e7d1c0f3447-kube-api-access-gzmrh\") pod \"auto-csr-approver-29563920-8l9zs\" (UID: \"b905f912-a71c-4b94-b3d0-5e7d1c0f3447\") " pod="openshift-infra/auto-csr-approver-29563920-8l9zs" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.711990 4965 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzmrh\" (UniqueName: \"kubernetes.io/projected/b905f912-a71c-4b94-b3d0-5e7d1c0f3447-kube-api-access-gzmrh\") pod \"auto-csr-approver-29563920-8l9zs\" (UID: \"b905f912-a71c-4b94-b3d0-5e7d1c0f3447\") " pod="openshift-infra/auto-csr-approver-29563920-8l9zs" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.712088 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzrvm\" (UniqueName: \"kubernetes.io/projected/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-kube-api-access-zzrvm\") pod \"collect-profiles-29563920-x7krv\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.712118 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a561e1d-b979-42db-b58e-f983b79b7217-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zwf4t\" (UID: \"4a561e1d-b979-42db-b58e-f983b79b7217\") " pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.712336 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fgz7\" (UniqueName: \"kubernetes.io/projected/4a561e1d-b979-42db-b58e-f983b79b7217-kube-api-access-7fgz7\") pod \"marketplace-operator-79b997595-zwf4t\" (UID: \"4a561e1d-b979-42db-b58e-f983b79b7217\") " pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.713951 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a561e1d-b979-42db-b58e-f983b79b7217-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zwf4t\" (UID: 
\"4a561e1d-b979-42db-b58e-f983b79b7217\") " pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.713987 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-config-volume\") pod \"collect-profiles-29563920-x7krv\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.713896 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a561e1d-b979-42db-b58e-f983b79b7217-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zwf4t\" (UID: \"4a561e1d-b979-42db-b58e-f983b79b7217\") " pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.714020 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-secret-volume\") pod \"collect-profiles-29563920-x7krv\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.714636 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-config-volume\") pod \"collect-profiles-29563920-x7krv\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.726001 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/4a561e1d-b979-42db-b58e-f983b79b7217-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zwf4t\" (UID: \"4a561e1d-b979-42db-b58e-f983b79b7217\") " pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.726973 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-secret-volume\") pod \"collect-profiles-29563920-x7krv\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.736262 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fgz7\" (UniqueName: \"kubernetes.io/projected/4a561e1d-b979-42db-b58e-f983b79b7217-kube-api-access-7fgz7\") pod \"marketplace-operator-79b997595-zwf4t\" (UID: \"4a561e1d-b979-42db-b58e-f983b79b7217\") " pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.739824 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzrvm\" (UniqueName: \"kubernetes.io/projected/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-kube-api-access-zzrvm\") pod \"collect-profiles-29563920-x7krv\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.740401 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzmrh\" (UniqueName: \"kubernetes.io/projected/b905f912-a71c-4b94-b3d0-5e7d1c0f3447-kube-api-access-gzmrh\") pod \"auto-csr-approver-29563920-8l9zs\" (UID: \"b905f912-a71c-4b94-b3d0-5e7d1c0f3447\") " pod="openshift-infra/auto-csr-approver-29563920-8l9zs" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.795147 4965 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.823169 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:27 crc kubenswrapper[4965]: I0318 12:01:27.896819 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563920-8l9zs" Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.029446 4965 generic.go:334] "Generic (PLEG): container finished" podID="46ebfdaa-4889-4dda-849c-58e0c672d243" containerID="c9793e47cc4a8d1a28225068756ede4d42d2ee00aff1e74f4bfc0959eb89a184" exitCode=0 Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.047549 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6sqx" event={"ID":"e7b87de4-ba72-4fb3-a94e-716b41d7881f","Type":"ContainerStarted","Data":"b3979c62c72899d8311a4df7b9c3f8bcbc6dfd9e05039966dde38b8972fa322b"} Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.047620 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkp58" event={"ID":"46ebfdaa-4889-4dda-849c-58e0c672d243","Type":"ContainerDied","Data":"c9793e47cc4a8d1a28225068756ede4d42d2ee00aff1e74f4bfc0959eb89a184"} Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.047639 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkp58" event={"ID":"46ebfdaa-4889-4dda-849c-58e0c672d243","Type":"ContainerStarted","Data":"de6db1a21fa91ed854438e9ad94986f70117db1ab04476684177b5a6a513b2a3"} Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.056816 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v6sqx" podStartSLOduration=2.627295086 
podStartE2EDuration="5.056794928s" podCreationTimestamp="2026-03-18 12:01:23 +0000 UTC" firstStartedPulling="2026-03-18 12:01:24.977509852 +0000 UTC m=+289.963697331" lastFinishedPulling="2026-03-18 12:01:27.407009684 +0000 UTC m=+292.393197173" observedRunningTime="2026-03-18 12:01:28.048107576 +0000 UTC m=+293.034295055" watchObservedRunningTime="2026-03-18 12:01:28.056794928 +0000 UTC m=+293.042982397" Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.250137 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv"] Mar 18 12:01:28 crc kubenswrapper[4965]: W0318 12:01:28.251519 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e37eff5_3e56_4a7b_bf6f_2eb636f543ea.slice/crio-07edb7d1a378f4b1abaa5299df5df6d9a224af8f978eacc5891dd2bafa12a08d WatchSource:0}: Error finding container 07edb7d1a378f4b1abaa5299df5df6d9a224af8f978eacc5891dd2bafa12a08d: Status 404 returned error can't find the container with id 07edb7d1a378f4b1abaa5299df5df6d9a224af8f978eacc5891dd2bafa12a08d Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.319257 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zwf4t"] Mar 18 12:01:28 crc kubenswrapper[4965]: W0318 12:01:28.336615 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a561e1d_b979_42db_b58e_f983b79b7217.slice/crio-7878274e6fd86f736b967cdf287e43758db8ccb590b2024f6a94e627afdda291 WatchSource:0}: Error finding container 7878274e6fd86f736b967cdf287e43758db8ccb590b2024f6a94e627afdda291: Status 404 returned error can't find the container with id 7878274e6fd86f736b967cdf287e43758db8ccb590b2024f6a94e627afdda291 Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.352739 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: 
Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-26 12:50:17.883375598 +0000 UTC Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.352781 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6072h48m49.530599849s for next certificate rotation Mar 18 12:01:28 crc kubenswrapper[4965]: I0318 12:01:28.413373 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563920-8l9zs"] Mar 18 12:01:28 crc kubenswrapper[4965]: W0318 12:01:28.417104 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb905f912_a71c_4b94_b3d0_5e7d1c0f3447.slice/crio-e74d79d3ea971cb2d0b9df47d041687cb5dc29f42e6742e8bff758aad824d400 WatchSource:0}: Error finding container e74d79d3ea971cb2d0b9df47d041687cb5dc29f42e6742e8bff758aad824d400: Status 404 returned error can't find the container with id e74d79d3ea971cb2d0b9df47d041687cb5dc29f42e6742e8bff758aad824d400 Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.039542 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqmvm" event={"ID":"4c6df52f-4c11-418c-9027-30248690ba00","Type":"ContainerStarted","Data":"d063dbe1a5b335e337765d5f392ae752feb28d1111485b6362d65db2b4a6409f"} Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.042229 4965 generic.go:334] "Generic (PLEG): container finished" podID="6b28d189-a695-4fa9-a184-1a5515d68729" containerID="aa992b72267b1fbd38eebec220496070537f1b41e61574ed3b319b0778058d66" exitCode=0 Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.042296 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nll29" event={"ID":"6b28d189-a695-4fa9-a184-1a5515d68729","Type":"ContainerDied","Data":"aa992b72267b1fbd38eebec220496070537f1b41e61574ed3b319b0778058d66"} Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.044335 4965 
generic.go:334] "Generic (PLEG): container finished" podID="3e37eff5-3e56-4a7b-bf6f-2eb636f543ea" containerID="059c27e18da6a4d3466e6f90bc42f4a03022a4a6cba45af618fa739a40ea371f" exitCode=0 Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.044388 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" event={"ID":"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea","Type":"ContainerDied","Data":"059c27e18da6a4d3466e6f90bc42f4a03022a4a6cba45af618fa739a40ea371f"} Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.044406 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" event={"ID":"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea","Type":"ContainerStarted","Data":"07edb7d1a378f4b1abaa5299df5df6d9a224af8f978eacc5891dd2bafa12a08d"} Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.060974 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rqmvm" podStartSLOduration=2.549004826 podStartE2EDuration="6.060954923s" podCreationTimestamp="2026-03-18 12:01:23 +0000 UTC" firstStartedPulling="2026-03-18 12:01:24.973066108 +0000 UTC m=+289.959253597" lastFinishedPulling="2026-03-18 12:01:28.485016215 +0000 UTC m=+293.471203694" observedRunningTime="2026-03-18 12:01:29.059358025 +0000 UTC m=+294.045545524" watchObservedRunningTime="2026-03-18 12:01:29.060954923 +0000 UTC m=+294.047142402" Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.069043 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563920-8l9zs" event={"ID":"b905f912-a71c-4b94-b3d0-5e7d1c0f3447","Type":"ContainerStarted","Data":"e74d79d3ea971cb2d0b9df47d041687cb5dc29f42e6742e8bff758aad824d400"} Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.079204 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkp58" 
event={"ID":"46ebfdaa-4889-4dda-849c-58e0c672d243","Type":"ContainerStarted","Data":"9e61f981c50eb64726a67e1c8d4219c92987752c3fe129fe8e49c66d3eb4c0d3"} Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.083949 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" event={"ID":"4a561e1d-b979-42db-b58e-f983b79b7217","Type":"ContainerStarted","Data":"c33c62c484a3ece40d1549cf4dd2846cdbfcf6192f1453b8a1cfcff10611fafb"} Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.084002 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" event={"ID":"4a561e1d-b979-42db-b58e-f983b79b7217","Type":"ContainerStarted","Data":"7878274e6fd86f736b967cdf287e43758db8ccb590b2024f6a94e627afdda291"} Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.142913 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" podStartSLOduration=2.142896543 podStartE2EDuration="2.142896543s" podCreationTimestamp="2026-03-18 12:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:01:29.140018866 +0000 UTC m=+294.126206345" watchObservedRunningTime="2026-03-18 12:01:29.142896543 +0000 UTC m=+294.129084022" Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.353122 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-01 18:49:05.468010151 +0000 UTC Mar 18 12:01:29 crc kubenswrapper[4965]: I0318 12:01:29.353182 4965 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6942h47m36.114831261s for next certificate rotation Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.091495 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-nll29" event={"ID":"6b28d189-a695-4fa9-a184-1a5515d68729","Type":"ContainerStarted","Data":"50f166f2d6f05c360f8388b21ec6331c7e06af7e9fca83977ee975e929a6829f"} Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.094074 4965 generic.go:334] "Generic (PLEG): container finished" podID="46ebfdaa-4889-4dda-849c-58e0c672d243" containerID="9e61f981c50eb64726a67e1c8d4219c92987752c3fe129fe8e49c66d3eb4c0d3" exitCode=0 Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.094193 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkp58" event={"ID":"46ebfdaa-4889-4dda-849c-58e0c672d243","Type":"ContainerDied","Data":"9e61f981c50eb64726a67e1c8d4219c92987752c3fe129fe8e49c66d3eb4c0d3"} Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.094960 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.097520 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zwf4t" Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.119736 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nll29" podStartSLOduration=1.512456952 podStartE2EDuration="4.119719362s" podCreationTimestamp="2026-03-18 12:01:26 +0000 UTC" firstStartedPulling="2026-03-18 12:01:27.010409651 +0000 UTC m=+291.996597130" lastFinishedPulling="2026-03-18 12:01:29.617672061 +0000 UTC m=+294.603859540" observedRunningTime="2026-03-18 12:01:30.115697221 +0000 UTC m=+295.101884700" watchObservedRunningTime="2026-03-18 12:01:30.119719362 +0000 UTC m=+295.105906841" Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.431028 4965 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.555648 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-config-volume\") pod \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.556025 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-secret-volume\") pod \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.556054 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzrvm\" (UniqueName: \"kubernetes.io/projected/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-kube-api-access-zzrvm\") pod \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\" (UID: \"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea\") " Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.556896 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e37eff5-3e56-4a7b-bf6f-2eb636f543ea" (UID: "3e37eff5-3e56-4a7b-bf6f-2eb636f543ea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.561777 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-kube-api-access-zzrvm" (OuterVolumeSpecName: "kube-api-access-zzrvm") pod "3e37eff5-3e56-4a7b-bf6f-2eb636f543ea" (UID: "3e37eff5-3e56-4a7b-bf6f-2eb636f543ea"). 
InnerVolumeSpecName "kube-api-access-zzrvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.562354 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e37eff5-3e56-4a7b-bf6f-2eb636f543ea" (UID: "3e37eff5-3e56-4a7b-bf6f-2eb636f543ea"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.657085 4965 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.657115 4965 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:01:30 crc kubenswrapper[4965]: I0318 12:01:30.657128 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzrvm\" (UniqueName: \"kubernetes.io/projected/3e37eff5-3e56-4a7b-bf6f-2eb636f543ea-kube-api-access-zzrvm\") on node \"crc\" DevicePath \"\"" Mar 18 12:01:31 crc kubenswrapper[4965]: I0318 12:01:31.101516 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" event={"ID":"3e37eff5-3e56-4a7b-bf6f-2eb636f543ea","Type":"ContainerDied","Data":"07edb7d1a378f4b1abaa5299df5df6d9a224af8f978eacc5891dd2bafa12a08d"} Mar 18 12:01:31 crc kubenswrapper[4965]: I0318 12:01:31.101554 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07edb7d1a378f4b1abaa5299df5df6d9a224af8f978eacc5891dd2bafa12a08d" Mar 18 12:01:31 crc kubenswrapper[4965]: I0318 12:01:31.101618 4965 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-x7krv" Mar 18 12:01:31 crc kubenswrapper[4965]: I0318 12:01:31.119573 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkp58" event={"ID":"46ebfdaa-4889-4dda-849c-58e0c672d243","Type":"ContainerStarted","Data":"a6aa46f53df3a85d6b806ab03009ac0b71b718bed32894b4fe7c0435dac1c324"} Mar 18 12:01:31 crc kubenswrapper[4965]: I0318 12:01:31.147003 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hkp58" podStartSLOduration=2.5920036189999998 podStartE2EDuration="5.146985393s" podCreationTimestamp="2026-03-18 12:01:26 +0000 UTC" firstStartedPulling="2026-03-18 12:01:28.033216918 +0000 UTC m=+293.019404397" lastFinishedPulling="2026-03-18 12:01:30.588198692 +0000 UTC m=+295.574386171" observedRunningTime="2026-03-18 12:01:31.145461368 +0000 UTC m=+296.131648847" watchObservedRunningTime="2026-03-18 12:01:31.146985393 +0000 UTC m=+296.133172872" Mar 18 12:01:34 crc kubenswrapper[4965]: I0318 12:01:34.032485 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:34 crc kubenswrapper[4965]: I0318 12:01:34.032840 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:34 crc kubenswrapper[4965]: I0318 12:01:34.075822 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:34 crc kubenswrapper[4965]: I0318 12:01:34.169118 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rqmvm" Mar 18 12:01:34 crc kubenswrapper[4965]: I0318 12:01:34.263701 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:34 crc kubenswrapper[4965]: I0318 12:01:34.263767 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:34 crc kubenswrapper[4965]: I0318 12:01:34.322455 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:35 crc kubenswrapper[4965]: I0318 12:01:35.179473 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v6sqx" Mar 18 12:01:36 crc kubenswrapper[4965]: I0318 12:01:36.152872 4965 generic.go:334] "Generic (PLEG): container finished" podID="b905f912-a71c-4b94-b3d0-5e7d1c0f3447" containerID="c5f9ca20ed634432788a9ac81efe842ddaa668e23a1aef428c53463e9860f0a5" exitCode=0 Mar 18 12:01:36 crc kubenswrapper[4965]: I0318 12:01:36.152946 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563920-8l9zs" event={"ID":"b905f912-a71c-4b94-b3d0-5e7d1c0f3447","Type":"ContainerDied","Data":"c5f9ca20ed634432788a9ac81efe842ddaa668e23a1aef428c53463e9860f0a5"} Mar 18 12:01:36 crc kubenswrapper[4965]: I0318 12:01:36.436432 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:36 crc kubenswrapper[4965]: I0318 12:01:36.436478 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:36 crc kubenswrapper[4965]: I0318 12:01:36.477025 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:36 crc kubenswrapper[4965]: I0318 12:01:36.664763 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:36 crc kubenswrapper[4965]: I0318 
12:01:36.664824 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:36 crc kubenswrapper[4965]: I0318 12:01:36.744832 4965 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:37 crc kubenswrapper[4965]: I0318 12:01:37.197381 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nll29" Mar 18 12:01:37 crc kubenswrapper[4965]: I0318 12:01:37.228194 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hkp58" Mar 18 12:01:37 crc kubenswrapper[4965]: I0318 12:01:37.450227 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563920-8l9zs" Mar 18 12:01:37 crc kubenswrapper[4965]: I0318 12:01:37.541780 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzmrh\" (UniqueName: \"kubernetes.io/projected/b905f912-a71c-4b94-b3d0-5e7d1c0f3447-kube-api-access-gzmrh\") pod \"b905f912-a71c-4b94-b3d0-5e7d1c0f3447\" (UID: \"b905f912-a71c-4b94-b3d0-5e7d1c0f3447\") " Mar 18 12:01:37 crc kubenswrapper[4965]: I0318 12:01:37.547694 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b905f912-a71c-4b94-b3d0-5e7d1c0f3447-kube-api-access-gzmrh" (OuterVolumeSpecName: "kube-api-access-gzmrh") pod "b905f912-a71c-4b94-b3d0-5e7d1c0f3447" (UID: "b905f912-a71c-4b94-b3d0-5e7d1c0f3447"). InnerVolumeSpecName "kube-api-access-gzmrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:01:37 crc kubenswrapper[4965]: I0318 12:01:37.643439 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzmrh\" (UniqueName: \"kubernetes.io/projected/b905f912-a71c-4b94-b3d0-5e7d1c0f3447-kube-api-access-gzmrh\") on node \"crc\" DevicePath \"\"" Mar 18 12:01:38 crc kubenswrapper[4965]: I0318 12:01:38.169909 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563920-8l9zs" event={"ID":"b905f912-a71c-4b94-b3d0-5e7d1c0f3447","Type":"ContainerDied","Data":"e74d79d3ea971cb2d0b9df47d041687cb5dc29f42e6742e8bff758aad824d400"} Mar 18 12:01:38 crc kubenswrapper[4965]: I0318 12:01:38.169969 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e74d79d3ea971cb2d0b9df47d041687cb5dc29f42e6742e8bff758aad824d400" Mar 18 12:01:38 crc kubenswrapper[4965]: I0318 12:01:38.169996 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563920-8l9zs" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.772810 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5x8mh"] Mar 18 12:01:41 crc kubenswrapper[4965]: E0318 12:01:41.773357 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e37eff5-3e56-4a7b-bf6f-2eb636f543ea" containerName="collect-profiles" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.773374 4965 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e37eff5-3e56-4a7b-bf6f-2eb636f543ea" containerName="collect-profiles" Mar 18 12:01:41 crc kubenswrapper[4965]: E0318 12:01:41.773387 4965 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b905f912-a71c-4b94-b3d0-5e7d1c0f3447" containerName="oc" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.773395 4965 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b905f912-a71c-4b94-b3d0-5e7d1c0f3447" containerName="oc" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.773527 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="b905f912-a71c-4b94-b3d0-5e7d1c0f3447" containerName="oc" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.773544 4965 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e37eff5-3e56-4a7b-bf6f-2eb636f543ea" containerName="collect-profiles" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.774020 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.787045 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5x8mh"] Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.904285 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52fdc537-4667-4352-9293-7a3c5657dc4f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.904332 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52fdc537-4667-4352-9293-7a3c5657dc4f-bound-sa-token\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.904362 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52fdc537-4667-4352-9293-7a3c5657dc4f-trusted-ca\") pod 
\"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.904384 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52fdc537-4667-4352-9293-7a3c5657dc4f-registry-certificates\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.904530 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52fdc537-4667-4352-9293-7a3c5657dc4f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.904682 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qp2s\" (UniqueName: \"kubernetes.io/projected/52fdc537-4667-4352-9293-7a3c5657dc4f-kube-api-access-4qp2s\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.904719 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52fdc537-4667-4352-9293-7a3c5657dc4f-registry-tls\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.904794 4965 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:41 crc kubenswrapper[4965]: I0318 12:01:41.924949 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.005940 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52fdc537-4667-4352-9293-7a3c5657dc4f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.005992 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52fdc537-4667-4352-9293-7a3c5657dc4f-bound-sa-token\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.006018 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52fdc537-4667-4352-9293-7a3c5657dc4f-trusted-ca\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.006038 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52fdc537-4667-4352-9293-7a3c5657dc4f-registry-certificates\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.006060 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52fdc537-4667-4352-9293-7a3c5657dc4f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.006084 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qp2s\" (UniqueName: \"kubernetes.io/projected/52fdc537-4667-4352-9293-7a3c5657dc4f-kube-api-access-4qp2s\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.006100 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52fdc537-4667-4352-9293-7a3c5657dc4f-registry-tls\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.007856 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52fdc537-4667-4352-9293-7a3c5657dc4f-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.007900 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52fdc537-4667-4352-9293-7a3c5657dc4f-trusted-ca\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.008144 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52fdc537-4667-4352-9293-7a3c5657dc4f-registry-certificates\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.012260 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52fdc537-4667-4352-9293-7a3c5657dc4f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.014056 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52fdc537-4667-4352-9293-7a3c5657dc4f-registry-tls\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.024981 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qp2s\" (UniqueName: 
\"kubernetes.io/projected/52fdc537-4667-4352-9293-7a3c5657dc4f-kube-api-access-4qp2s\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh"
Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.025986 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52fdc537-4667-4352-9293-7a3c5657dc4f-bound-sa-token\") pod \"image-registry-66df7c8f76-5x8mh\" (UID: \"52fdc537-4667-4352-9293-7a3c5657dc4f\") " pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh"
Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.089927 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh"
Mar 18 12:01:42 crc kubenswrapper[4965]: I0318 12:01:42.321514 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5x8mh"]
Mar 18 12:01:43 crc kubenswrapper[4965]: I0318 12:01:43.197597 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" event={"ID":"52fdc537-4667-4352-9293-7a3c5657dc4f","Type":"ContainerStarted","Data":"e5a9dee63b82ee8641b45725d8bcf6a31300f7d68db32c5234ef51050abbbe81"}
Mar 18 12:01:43 crc kubenswrapper[4965]: I0318 12:01:43.197924 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh"
Mar 18 12:01:43 crc kubenswrapper[4965]: I0318 12:01:43.197941 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" event={"ID":"52fdc537-4667-4352-9293-7a3c5657dc4f","Type":"ContainerStarted","Data":"c811141b0884cab2d32e44cf4bcd74a54faad68f502da0a669a087486b4e01fd"}
Mar 18 12:01:43 crc kubenswrapper[4965]: I0318 12:01:43.226080 4965 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh" podStartSLOduration=2.226059135 podStartE2EDuration="2.226059135s" podCreationTimestamp="2026-03-18 12:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:01:43.221310102 +0000 UTC m=+308.207497581" watchObservedRunningTime="2026-03-18 12:01:43.226059135 +0000 UTC m=+308.212246614"
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.139540 4965 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563922-vhnwf"]
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.142393 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563922-vhnwf"
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.144905 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.145088 4965 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.145619 4965 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zz9vs"
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.147921 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563922-vhnwf"]
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.251220 4965 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v8gw\" (UniqueName: \"kubernetes.io/projected/cb6b749a-c62b-40bc-bdbe-a42fe3ececf9-kube-api-access-2v8gw\") pod \"auto-csr-approver-29563922-vhnwf\" (UID: \"cb6b749a-c62b-40bc-bdbe-a42fe3ececf9\") " pod="openshift-infra/auto-csr-approver-29563922-vhnwf"
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.352347 4965 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v8gw\" (UniqueName: \"kubernetes.io/projected/cb6b749a-c62b-40bc-bdbe-a42fe3ececf9-kube-api-access-2v8gw\") pod \"auto-csr-approver-29563922-vhnwf\" (UID: \"cb6b749a-c62b-40bc-bdbe-a42fe3ececf9\") " pod="openshift-infra/auto-csr-approver-29563922-vhnwf"
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.378473 4965 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v8gw\" (UniqueName: \"kubernetes.io/projected/cb6b749a-c62b-40bc-bdbe-a42fe3ececf9-kube-api-access-2v8gw\") pod \"auto-csr-approver-29563922-vhnwf\" (UID: \"cb6b749a-c62b-40bc-bdbe-a42fe3ececf9\") " pod="openshift-infra/auto-csr-approver-29563922-vhnwf"
Mar 18 12:02:00 crc kubenswrapper[4965]: I0318 12:02:00.476147 4965 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563922-vhnwf"
Mar 18 12:02:01 crc kubenswrapper[4965]: I0318 12:02:01.005745 4965 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563922-vhnwf"]
Mar 18 12:02:01 crc kubenswrapper[4965]: W0318 12:02:01.012334 4965 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb6b749a_c62b_40bc_bdbe_a42fe3ececf9.slice/crio-2e308f56c0c5b28d90306e562e50334ea013ef28c6f2d079e97e9ffebe5a2c21 WatchSource:0}: Error finding container 2e308f56c0c5b28d90306e562e50334ea013ef28c6f2d079e97e9ffebe5a2c21: Status 404 returned error can't find the container with id 2e308f56c0c5b28d90306e562e50334ea013ef28c6f2d079e97e9ffebe5a2c21
Mar 18 12:02:01 crc kubenswrapper[4965]: I0318 12:02:01.302002 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563922-vhnwf" event={"ID":"cb6b749a-c62b-40bc-bdbe-a42fe3ececf9","Type":"ContainerStarted","Data":"2e308f56c0c5b28d90306e562e50334ea013ef28c6f2d079e97e9ffebe5a2c21"}
Mar 18 12:02:02 crc kubenswrapper[4965]: I0318 12:02:02.095871 4965 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-5x8mh"
Mar 18 12:02:02 crc kubenswrapper[4965]: I0318 12:02:02.195856 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmt4c"]
Mar 18 12:02:03 crc kubenswrapper[4965]: I0318 12:02:03.315722 4965 generic.go:334] "Generic (PLEG): container finished" podID="cb6b749a-c62b-40bc-bdbe-a42fe3ececf9" containerID="d565433309fb29206c8791b7919be5c9702d4f8281fcf23d567bb693eb220702" exitCode=0
Mar 18 12:02:03 crc kubenswrapper[4965]: I0318 12:02:03.315824 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563922-vhnwf" event={"ID":"cb6b749a-c62b-40bc-bdbe-a42fe3ececf9","Type":"ContainerDied","Data":"d565433309fb29206c8791b7919be5c9702d4f8281fcf23d567bb693eb220702"}
Mar 18 12:02:04 crc kubenswrapper[4965]: I0318 12:02:04.566326 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563922-vhnwf"
Mar 18 12:02:04 crc kubenswrapper[4965]: I0318 12:02:04.728449 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v8gw\" (UniqueName: \"kubernetes.io/projected/cb6b749a-c62b-40bc-bdbe-a42fe3ececf9-kube-api-access-2v8gw\") pod \"cb6b749a-c62b-40bc-bdbe-a42fe3ececf9\" (UID: \"cb6b749a-c62b-40bc-bdbe-a42fe3ececf9\") "
Mar 18 12:02:04 crc kubenswrapper[4965]: I0318 12:02:04.736090 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb6b749a-c62b-40bc-bdbe-a42fe3ececf9-kube-api-access-2v8gw" (OuterVolumeSpecName: "kube-api-access-2v8gw") pod "cb6b749a-c62b-40bc-bdbe-a42fe3ececf9" (UID: "cb6b749a-c62b-40bc-bdbe-a42fe3ececf9"). InnerVolumeSpecName "kube-api-access-2v8gw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:02:04 crc kubenswrapper[4965]: I0318 12:02:04.830374 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v8gw\" (UniqueName: \"kubernetes.io/projected/cb6b749a-c62b-40bc-bdbe-a42fe3ececf9-kube-api-access-2v8gw\") on node \"crc\" DevicePath \"\""
Mar 18 12:02:05 crc kubenswrapper[4965]: I0318 12:02:05.333587 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563922-vhnwf" event={"ID":"cb6b749a-c62b-40bc-bdbe-a42fe3ececf9","Type":"ContainerDied","Data":"2e308f56c0c5b28d90306e562e50334ea013ef28c6f2d079e97e9ffebe5a2c21"}
Mar 18 12:02:05 crc kubenswrapper[4965]: I0318 12:02:05.333645 4965 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e308f56c0c5b28d90306e562e50334ea013ef28c6f2d079e97e9ffebe5a2c21"
Mar 18 12:02:05 crc kubenswrapper[4965]: I0318 12:02:05.333704 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563922-vhnwf"
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.240679 4965 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" podUID="ad0ada2e-efc3-4a75-b954-15e8b893e2bf" containerName="registry" containerID="cri-o://dea0017c6cd6eee4415187b0061d85592cca8cc67dc8ab41b96699adb9d7917b" gracePeriod=30
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.458414 4965 generic.go:334] "Generic (PLEG): container finished" podID="ad0ada2e-efc3-4a75-b954-15e8b893e2bf" containerID="dea0017c6cd6eee4415187b0061d85592cca8cc67dc8ab41b96699adb9d7917b" exitCode=0
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.458620 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" event={"ID":"ad0ada2e-efc3-4a75-b954-15e8b893e2bf","Type":"ContainerDied","Data":"dea0017c6cd6eee4415187b0061d85592cca8cc67dc8ab41b96699adb9d7917b"}
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.573486 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c"
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.753421 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-ca-trust-extracted\") pod \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") "
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.753484 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-bound-sa-token\") pod \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") "
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.753506 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-certificates\") pod \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") "
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.753538 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfnlh\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-kube-api-access-vfnlh\") pod \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") "
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.753681 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") "
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.753729 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-tls\") pod \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") "
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.753753 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-trusted-ca\") pod \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") "
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.753789 4965 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-installation-pull-secrets\") pod \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\" (UID: \"ad0ada2e-efc3-4a75-b954-15e8b893e2bf\") "
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.755184 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ad0ada2e-efc3-4a75-b954-15e8b893e2bf" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.755737 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ad0ada2e-efc3-4a75-b954-15e8b893e2bf" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.772110 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ad0ada2e-efc3-4a75-b954-15e8b893e2bf" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.772190 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ad0ada2e-efc3-4a75-b954-15e8b893e2bf" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.772905 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-kube-api-access-vfnlh" (OuterVolumeSpecName: "kube-api-access-vfnlh") pod "ad0ada2e-efc3-4a75-b954-15e8b893e2bf" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf"). InnerVolumeSpecName "kube-api-access-vfnlh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.773087 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ad0ada2e-efc3-4a75-b954-15e8b893e2bf" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.776964 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ad0ada2e-efc3-4a75-b954-15e8b893e2bf" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.780211 4965 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "ad0ada2e-efc3-4a75-b954-15e8b893e2bf" (UID: "ad0ada2e-efc3-4a75-b954-15e8b893e2bf"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.855198 4965 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.855239 4965 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.855252 4965 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfnlh\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-kube-api-access-vfnlh\") on node \"crc\" DevicePath \"\""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.855269 4965 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.855285 4965 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.855296 4965 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 18 12:02:27 crc kubenswrapper[4965]: I0318 12:02:27.855307 4965 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad0ada2e-efc3-4a75-b954-15e8b893e2bf-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 18 12:02:28 crc kubenswrapper[4965]: I0318 12:02:28.467407 4965 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c" event={"ID":"ad0ada2e-efc3-4a75-b954-15e8b893e2bf","Type":"ContainerDied","Data":"27a6752d52e21dcbefacc51add472f8feec6d5863fc68e1e8c5aa9dfbccc2919"}
Mar 18 12:02:28 crc kubenswrapper[4965]: I0318 12:02:28.467456 4965 scope.go:117] "RemoveContainer" containerID="dea0017c6cd6eee4415187b0061d85592cca8cc67dc8ab41b96699adb9d7917b"
Mar 18 12:02:28 crc kubenswrapper[4965]: I0318 12:02:28.467596 4965 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zmt4c"
Mar 18 12:02:28 crc kubenswrapper[4965]: I0318 12:02:28.492866 4965 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmt4c"]
Mar 18 12:02:28 crc kubenswrapper[4965]: I0318 12:02:28.501184 4965 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmt4c"]
Mar 18 12:02:30 crc kubenswrapper[4965]: I0318 12:02:30.027204 4965 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0ada2e-efc3-4a75-b954-15e8b893e2bf" path="/var/lib/kubelet/pods/ad0ada2e-efc3-4a75-b954-15e8b893e2bf/volumes"
Mar 18 12:03:11 crc kubenswrapper[4965]: I0318 12:03:11.602496 4965 patch_prober.go:28] interesting pod/machine-config-daemon-8l67n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:03:11 crc kubenswrapper[4965]: I0318 12:03:11.603201 4965 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8l67n" podUID="e9a53215-1d0d-47de-92c4-cea3209fe4fa" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"